
Things You Must Know About Data Harvesting & Data Mining

Monday, January 25, 2021

Since the phrase "Big Data" went viral, everything related to data has sprung up: web scraping, web harvesting, web mining, data analysis, data mining, and so on. These terms have been used so interchangeably that the realm of data has become even more confusing for many people. A clear understanding of them is necessary for businesses to stay well-informed in the cutthroat marketing industry.


What is Data Harvesting?

Data harvesting means getting data and information from online resources. It is usually used interchangeably with web scraping, web crawling, and data extraction. Harvesting is an agricultural term that means gathering ripe crops from the fields, which involves the acts of collection and relocation. Likewise, data harvesting is the process of extracting valuable data from target websites and putting it into your database in a structured format.


To conduct data harvesting, you need an automated crawler to parse the target websites, capture the valuable information, extract the data, and finally export it into a structured format for further analysis. Data harvesting, therefore, doesn't involve algorithms, machine learning, or statistics. Instead, it relies on programming languages like Python, R, or Java to function. Above all, data harvesting is about being accurate.
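As a rough illustration, the parse-capture-export pipeline described above can be sketched in Python using only the standard library. The page snippet, tag names, and CSS classes below are hypothetical stand-ins for a real target website:

```python
from html.parser import HTMLParser

# Hypothetical page snippet standing in for a target website's HTML.
SAMPLE_HTML = """
<ul>
  <li class="product"><span class="name">Widget</span><span class="price">9.99</span></li>
  <li class="product"><span class="name">Gadget</span><span class="price">19.50</span></li>
</ul>
"""

class ProductParser(HTMLParser):
    """Captures text inside <span class="name"> and <span class="price"> tags."""

    def __init__(self):
        super().__init__()
        self.rows = []      # structured output: one dict per product
        self._field = None  # which field, if any, the parser is currently inside

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if tag == "li" and cls == "product":
            self.rows.append({})          # start a new structured record
        elif tag == "span" and cls in ("name", "price"):
            self._field = cls             # remember which field comes next

    def handle_data(self, data):
        if self._field and self.rows:
            self.rows[-1][self._field] = data.strip()
            self._field = None

parser = ProductParser()
parser.feed(SAMPLE_HTML)
print(parser.rows)
# → [{'name': 'Widget', 'price': '9.99'}, {'name': 'Gadget', 'price': '19.50'}]
```

In practice a crawler would fetch the HTML over the network and export the rows to a database or CSV file; the sketch only shows the parse-and-structure step.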


There are many data extraction tools and service providers that can conduct web harvesting for you. Octoparse stands out as the best web scraping tool. Whether you are a first-time self-starter or an experienced programmer, it is the best choice for harvesting data from the internet.


What is Data Mining?

Data mining is often misunderstood as simply a process of obtaining data. There are substantial differences between collecting data and mining data, even though both involve acts of extraction. Data mining is the process of discovering fact-based patterns in a large set of data. Rather than just getting the data and making sense of it, data mining is interdisciplinary, integrating statistics, computer science, and machine learning.


In the famous Cambridge Analytica scandal, the firm collected over 60 million Facebook users' information and singled out those who were uncertain about their votes based on their identities and activities on Facebook. Cambridge Analytica then employed a "psychographic microtargeting" tactic to bombard them with inflammatory messages and shift their votes. It is a typical yet harmful application of data mining: it discovers who people are and what they do, and in return helps achieve a goal. It sounds like magic, yet it is complicated.


Data mining has several key applications.

  • Classification

    Just as the word implies, data mining is used to put things or people into different categories for further analysis. For example, a bank builds a classification model from loan applications. It gathers millions of applications along with each individual's bank statements, job title, marital status, school diploma, etc., then uses algorithms to decide which applications are riskier than others. That said, the moment you fill out the application form, the bank already knows what category you belong to and what loan applies to you.


  • Regression

    Regression is used to predict trends based on numerical values in the datasets. It is a statistical analysis of the relationships between variables. For example, you can predict how likely a crime is to occur in a specific area based on historical records.

  • Clustering

    Clustering groups data points based on similar traits or values. For example, Amazon groups similar products together based on each item's description, tags, and functions so that customers can find them more easily.

  • Anomaly detection

    Anomaly detection is the process of detecting abnormal behaviors, also called outliers. Banks employ this method to spot unusual transactions that don't fit your normal transaction activity.

  • Association learning

    Association learning answers the question "how does the value of one feature relate to that of another?" For example, in grocery stores, people who buy soda are also likely to buy Pringles. Market basket analysis is a popular application of association rules; it helps retailers identify relationships between the products consumers buy together.
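To make one of these applications concrete, here is a minimal anomaly detection sketch in plain Python using z-scores, in the spirit of the bank example above. The transaction amounts and the 2.5 threshold are illustrative assumptions, not a production fraud rule:

```python
from statistics import mean, stdev

def find_outliers(amounts, threshold=2.5):
    """Flag values whose z-score (distance from the mean, in standard
    deviations) exceeds the threshold."""
    mu, sigma = mean(amounts), stdev(amounts)
    return [a for a in amounts if abs(a - mu) / sigma > threshold]

# Hypothetical daily card transactions: mostly small, one unusually large.
transactions = [25, 40, 32, 18, 55, 29, 41, 3000, 37, 22]
print(find_outliers(transactions))
# → [3000]
```

Real banking systems combine many such signals (location, merchant, timing) with learned models, but the core idea is the same: measure how far a new data point sits from the established pattern.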


These applications build the backbone of data mining. So to speak, data mining is the core of Big Data. The process of data mining is also known as Knowledge Discovery from Data (KDD). It underpins data science, which studies research and knowledge discovery. Data can be structured or unstructured and scattered over the internet. The real power comes when each piece is grouped and set apart into categories so that we can draw patterns, predict trends, and detect abnormalities.



Author: Ashley

Ashley is a data enthusiast and passionate blogger with hands-on experience in web scraping. She focuses on capturing web data and analyzing it in a way that empowers companies and businesses with actionable insights. Read her blog here to discover practical tips and applications of web data extraction.

If you would like to read this content in Spanish, please click: Data Harvesting y Data Mining: ¿Cuál es la diferencia? You can also read web scraping articles on El Website Oficial.








