Whether you are a digital native or a digital immigrant, you probably know the basic functions of Excel inside out. Excel makes it easy to accomplish simple tasks such as sorting, filtering, and outlining data and making charts based on it. And when the data is highly structured, we can even perform advanced analysis using pivot tables and regression models in Excel.
But the problem is: how can we extract data at scale and get it into Excel efficiently? Doing this manually, by repetitive typing, searching, copying, and pasting, would be extremely tedious. So how can we automate the extraction and scraping of website data into Excel? Read on to find three different solutions.
Method 1: Automated Web Scraping Tool
If you are looking for an easy way to scrape data from web pages into Excel but don’t know how to code, you can try Octoparse, a web scraping tool that can scrape website data and export it into Excel worksheets either directly or via API. Octoparse works on both Windows and macOS. It provides an auto-detecting mode for almost all websites, plus preset templates for popular sites, so you can extract data into Excel without coding skills. Follow the simple guide below.
Steps to Extract data from website to Excel with Octoparse
Step 1: Paste target website URL to begin auto-detecting.
After downloading and installing Octoparse on your device, paste the link of the site you want to scrape, and Octoparse will start auto-detecting.
Step 2: Customize the data field you want to extract.
A workflow will be created after auto-detection finishes. You can easily change the data fields according to your needs. A Tips panel will appear, and you can follow the hints it gives.
Step 3: Download scraped website data into Excel.
Run the task after you have checked all the data fields. You can quickly download the scraped data in Excel/CSV format to your local device, or save it to a database.
Besides auto-detecting, you can also scrape popular sites like Amazon, eBay, LinkedIn, Google Maps, etc. with preset templates. Just read the tutorial on scraping data into Excel with Octoparse templates.
Video tutorial: extract web data to Excel efficiently
Web scraping project customer service
If time is your most valuable asset and you want to focus on your core business, outsourcing this complicated work to a proficient web scraping team with experience and expertise might be the best option. Scraping data from websites can be difficult because anti-scraping measures restrain the practice of web scraping. A proficient web scraping team can get data from websites properly and deliver structured data to you in an Excel sheet, or in any format you need.
Here are some customer stories showing how Octoparse’s web scraping service helps businesses of all sizes.
Method 2: Get Web Data Using Excel Web Queries
Apart from manually copying and pasting data from a web page, Excel Web Queries can be used to quickly retrieve data from a standard web page into an Excel worksheet. They automatically detect tables embedded in the page’s HTML. Excel Web Queries are also useful in situations where a standard ODBC (Open Database Connectivity) connection is hard to create or maintain. You can directly scrape a table from any website using Excel Web Queries.
Here are the simple steps to extract website data with Excel Web Queries:
- Step 1. Go to Data > Get External Data > From Web (in newer versions of Excel, Data > Get & Transform Data > From Web).
- Step 2. A browser window named “New Web Query” will appear.
- Step 3. In the address bar, enter the web address.
- Step 4. The page will load and will show yellow icons against data/tables.
- Step 5. Select the appropriate one.
- Step 6. Press the Import button.
Now you have the web data scraped into the Excel worksheet, perfectly arranged in rows and columns.
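Conceptually, a Web Query does two things: it finds `<table>` elements in the page’s HTML and arranges their cells into rows and columns. Here is a minimal sketch of that idea using only the Python standard library; the HTML snippet below is a hypothetical stand-in for a page you would actually download.

```python
from html.parser import HTMLParser
import csv, io

class TableExtractor(HTMLParser):
    """Collects the cells of <table> elements into rows of text."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._cell = [], [], None

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []                 # start a new table row
        elif tag in ("td", "th"):
            self._cell = ""                # start collecting cell text

    def handle_endtag(self, tag):
        if tag in ("td", "th") and self._cell is not None:
            self._row.append(self._cell.strip())
            self._cell = None
        elif tag == "tr" and self._row:
            self.rows.append(self._row)    # finished row goes into the result

    def handle_data(self, data):
        if self._cell is not None:
            self._cell += data             # accumulate text inside a cell

# Hypothetical page content standing in for a real download.
html = """<table>
<tr><th>Product</th><th>Price</th></tr>
<tr><td>Widget</td><td>9.99</td></tr>
<tr><td>Gadget</td><td>19.50</td></tr>
</table>"""

parser = TableExtractor()
parser.feed(html)

# Write the rows to CSV, a format Excel opens directly.
buf = io.StringIO()
csv.writer(buf).writerows(parser.rows)
print(parser.rows)  # → [['Product', 'Price'], ['Widget', '9.99'], ['Gadget', '19.50']]
```

In a real script you would replace the hard-coded snippet with the downloaded page source and write the CSV to a file instead of an in-memory buffer.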
Method 3: Scrape Web Data with Excel VBA
Most of us use formulas in Excel (e.g. =AVERAGE(…), =SUM(…), =IF(…)) a lot, but are less familiar with the built-in language, Visual Basic for Applications, a.k.a. VBA. It is commonly known as “Macros”, and such Excel files are saved with a .xlsm extension. Before using it, you need to enable the Developer tab in the ribbon (File > Options > Customize Ribbon > check the Developer tab). Then set up your layout. In the developer interface, you can write VBA code attached to various events. See Microsoft’s documentation for getting started with VBA in Excel 2010.
Using Excel VBA is a bit more technical and not very friendly for non-programmers. VBA works by running macros: step-by-step procedures written in Visual Basic. To scrape data from websites into Excel using VBA, we need to build or obtain a VBA script that sends requests to web pages and receives the data they return. It is common to use VBA with XMLHTTP and regular expressions to parse the pages. On Windows, you can use VBA with WinHTTP or Internet Explorer to scrape data from websites into Excel (note that Internet Explorer has been retired, so the XMLHTTP/WinHTTP route is more future-proof).
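The request-plus-regular-expressions approach described above can be sketched in Python as well. In this sketch, the URL, the `<span class="price">` markup, and the price pattern are all hypothetical; adapt the regex to the actual markup of the page you are scraping.

```python
import re
import urllib.request

def fetch(url: str) -> str:
    """Download a page's raw HTML (the equivalent of XMLHTTP's send)."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

def extract_prices(html: str) -> list:
    """Pull every price out of a hypothetical <span class="price"> tag."""
    return re.findall(r'<span class="price">\$?([\d.]+)</span>', html)

# A canned response stands in for fetch("https://example.com/products")
# so the sketch runs without network access.
sample = (
    '<div><span class="price">$12.99</span></div>'
    '<div><span class="price">$7.50</span></div>'
)
print(extract_prices(sample))  # → ['12.99', '7.50']
```

Once the values are extracted, you can write them into worksheet cells (in VBA) or into a CSV file (in Python) for Excel to open. Be aware that regular expressions are brittle against markup changes; a dedicated HTML parser is more robust for complex pages.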
With some patience and practice, you will find it worthwhile to learn some Excel VBA and some HTML to make scraping web data into Excel easier and to automate the repetitive work. There is plenty of material, and many forums, to help you learn to write VBA code.
Now you have learned three different ways to pull data from websites into Excel. Choose the one that best suits your situation. If you don’t have coding skills, or simply want to save time, Octoparse remains the easiest option.