Scraping Data from Website to Excel
Friday, February 24, 2017
(picture from thewindowsbulletin.com)
Excel is a powerful tool for working with structured data. It makes it easy to sort, filter, chart, and group and outline data, and to run basic analyses with pivot tables; you can even build regression models and use many more features for manipulating, analyzing, and presenting data. Often, though, the data we want to analyze lives on websites, and we need to pull large amounts of refreshable web data into Excel before we can gain any insight from it. In this article, I will introduce several ways to scrape web data into Excel that will save you time and energy.
Getting web data using Excel Web Queries
Beyond copying and pasting from a web page, Excel Web Queries let you quickly retrieve data from a standard web page into an Excel worksheet. Excel automatically detects the tables on the page and lets you choose the one you need. Web queries are also handy when a standard ODBC (Open Database Connectivity) connection would be hard to create or maintain just to pull data from a web page. You can scrape a table from virtually any website this way.
The process boils down to a few simple steps (check out this article):
1. Go to Data > Get External Data > From Web.
2. A browser window named “New Web Query” will appear.
3. Enter the web address in the address bar.
(picture from excel-university.com)
4. The page will load and show yellow icons next to the data/tables it detects.
5. Select the table you need.
6. Press the Import button.
Now you have the web data in your Excel worksheet, neatly arranged in rows and columns.
Note that this method limits you to the information shown on that page: anything the page does not display (such as a contact's phone number or email address) cannot be acquired this way.
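If you are curious what Web Queries do under the hood, the same table extraction can be sketched in a few lines of Python using only the standard library. The HTML fragment below is an invented stand-in for a page Excel would load:

```python
from html.parser import HTMLParser

# Invented page fragment standing in for the web page Excel would load.
SAMPLE_HTML = """
<table>
  <tr><th>Product</th><th>Price</th></tr>
  <tr><td>Widget</td><td>9.99</td></tr>
  <tr><td>Gadget</td><td>19.50</td></tr>
</table>
"""

class TableExtractor(HTMLParser):
    """Collect each <tr> as a list of cell strings."""
    def __init__(self):
        super().__init__()
        self.rows = []        # finished rows
        self._row = None      # row currently being built
        self._in_cell = False # inside a <td>/<th>?

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row is not None:
            self.rows.append(self._row)
            self._row = None
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell:
            self._row.append(data.strip())

parser = TableExtractor()
parser.feed(SAMPLE_HTML)
print(parser.rows)
# → [['Product', 'Price'], ['Widget', '9.99'], ['Gadget', '19.50']]
```

Each inner list maps naturally onto one worksheet row, which is exactly the rows-and-columns result Web Queries gives you.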
Getting web data using Excel VBA
Most of us use formulas in Excel (e.g. =AVERAGE(...), =SUM(...), =IF(...)) a lot, but are less familiar with its built-in language, Visual Basic for Applications, a.k.a. VBA. It is commonly known through “macros”, and Excel files containing macros are saved with the *.xlsm extension. Before using it, you need to enable the Developer tab on the ribbon (File -> Options -> Customize Ribbon -> check Developer). In the developer interface, you can write VBA code attached to various events. See Microsoft's guide (https://msdn.microsoft.com/en-us/library/office/ee814737(v=office.14).aspx) to get started with VBA in Excel 2010.
Using Excel VBA gets a bit technical, so it is not very friendly for the non-programmers among us. VBA works by running macros: step-by-step procedures written in Visual Basic. To scrape data from websites into Excel with VBA, you need a VBA script that sends requests to web pages and parses the data they return. It is common to pair VBA with XMLHTTP and regular expressions to parse the pages; on Windows, you can also use WinHTTP or the InternetExplorer object to retrieve pages for scraping into Excel.
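The request-then-parse pattern that the VBA approach relies on can be sketched in Python. Everything here is illustrative: the markup and the `price` class name are invented, and the canned string stands in for the `responseText` that an XMLHTTP request would return in a real macro or script:

```python
import re

# Canned stand-in for the responseText an XMLHTTP request would return;
# a real script would fetch this over HTTP first.
response_text = """
<html><body>
  <span class="price">$12.00</span>
  <span class="price">$7.50</span>
</body></html>
"""

# The regular expression pulls every price out of the markup, mirroring
# the regex-parsing step described above. The "price" class name is an
# assumption about the page being scraped.
prices = re.findall(r'<span class="price">\$([\d.]+)</span>', response_text)
print(prices)  # → ['12.00', '7.50']
```

Regex parsing like this is fragile if the site's markup changes, which is one reason the tool-based options below are attractive to non-programmers.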
With some patience and practice, you will find it worthwhile to learn some Excel VBA and basic HTML: it makes web scraping into Excel much easier and saves time by automating the repetitive work. There is plenty of material and many learning forums online for writing VBA code, and you can also find ready-to-use VBA scripts to adapt directly.
Automated Web Scraping Tools
If you are just looking for a quick tool to scrape data off pages into Excel and do not want to set up VBA code yourself, I'd strongly recommend an automated web scraping tool such as Octoparse (https://www.octoparse.com/) to scrape data into your Excel worksheet directly or via an API. There is no need to learn to code, and most web scraping tools are easy to use. You can pick a free web scraping tool to start getting data from websites immediately and export the scraped data to Excel. Different tools have their own pros and cons, so choose the one that fits your project.
Check out this post and try out these TOP 30 free web scraping tools.
Outsource Your Web Scraping Project
If time is your most valuable asset and you want to focus on your core business, outsourcing the complicated web scraping work to a proficient web scraping team with experience and expertise is the best choice. Because some websites and portals are difficult to scrape due to technical restrictions or copyright notices, a professional team can help you get the data the right way and deliver it as an Excel file or in any other format you need.
Build a Web Scraper Using Programming Language
Generally, people with programming experience write their own scripts in a language such as PHP, Python, Perl, or Ruby, and then import the extracted data into Excel. If that describes you, feel free to skip this post and browse the other posts you may be interested in.
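As a minimal sketch of that route, a script can dump whatever it scraped to CSV, a format Excel opens directly. The rows here are invented sample data; a real script would fill them in from its scraping step:

```python
import csv
import io

# Rows a scraper script might have extracted (invented sample data).
rows = [
    ["Product", "Price"],
    ["Widget", "9.99"],
    ["Gadget", "19.50"],
]

# Serialize the rows as CSV; writing to a .csv file instead of an
# in-memory buffer gives you a file Excel opens with one double-click.
buf = io.StringIO()
csv.writer(buf).writerows(rows)
csv_text = buf.getvalue()
print(csv_text)
```

From there, Excel's own import dialog (or simply opening the .csv file) lands the data in rows and columns, ready for pivot tables and charts.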
(picture from myskilldomain.com)
Author: The Octoparse Team