At the bi-weekly general call, SECONDO researchers announced the completion of the crawler, a significant component of the project. The crawler scans the Dark Web and Twitter for data that could endanger an organization.
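The article does not describe the crawler's internals, but a minimal sketch can illustrate the general idea of scanning fetched content for terms that may signal a threat to a monitored organization. The source URL, watch terms, and flagging logic below are purely illustrative assumptions, not part of SECONDO.

```python
"""Hypothetical sketch of a single keyword-based crawler pass.

The SECONDO crawler is not public; this only illustrates scanning fetched
pages for terms that may indicate a threat to a monitored organisation.
Source URLs and watch terms are assumptions for illustration only.
"""
import urllib.request

SOURCES = [
    "https://example.org/security-feed",  # placeholder for a monitored source
]
WATCH_TERMS = {"acme corp", "credential dump", "exploit for sale"}  # assumed terms


def fetch(url: str) -> str:
    """Download a page and return its text (best-effort UTF-8 decode)."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")


def scan(text: str) -> list[str]:
    """Return the watch terms that appear in the fetched text."""
    lowered = text.lower()
    return [term for term in WATCH_TERMS if term in lowered]


if __name__ == "__main__":
    for url in SOURCES:
        try:
            hits = scan(fetch(url))
        except OSError as exc:
            print(f"could not fetch {url}: {exc}")
            continue
        if hits:
            print(f"{url}: flag for analysis, matched {hits}")
```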
The researchers will specify and quantify the collected and processed data within a metamodel. This task bundles a set of algorithms that will perform sophisticated analyses on top of the aggregated data whenever required during the course of the project.
These algorithms have been classified into three categories: regression analysis; predictive and prescriptive analysis; and data mining and machine learning.
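As a rough illustration of how such a bundle of algorithms might be organized and invoked on aggregated data, the sketch below registers analysis routines under the categories named above and runs a simple regression over assumed weekly mention counts. The registry design, the `trend` routine, and the sample data are assumptions and do not reflect SECONDO's actual metamodel.

```python
"""Hypothetical sketch of bundling analysis algorithms by category.

The category names follow the article; the registry pattern, the `trend`
routine, and the sample data are illustrative assumptions only.
"""
from collections import defaultdict
from typing import Callable

REGISTRY: dict[str, dict[str, Callable]] = defaultdict(dict)


def register(category: str, name: str):
    """Decorator that files an analysis routine under a category."""
    def wrap(fn: Callable) -> Callable:
        REGISTRY[category][name] = fn
        return fn
    return wrap


@register("regression analysis", "trend")
def trend(values: list[float]) -> float:
    """Least-squares slope of evenly spaced observations (simple linear regression)."""
    n = len(values)
    xs = range(n)
    mean_x, mean_y = (n - 1) / 2, sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var


if __name__ == "__main__":
    # Assumed weekly counts of relevant mentions collected by the crawler.
    weekly_mentions = [3.0, 5.0, 4.0, 8.0, 9.0]
    slope = REGISTRY["regression analysis"]["trend"](weekly_mentions)
    print(f"mentions are rising by about {slope:.2f} per week")
```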