Lesson 6: Schedule regular runs

By now, you've finished building your first scraping task and know how to run it to get your target data. Let's take it to the next level and see how the features below can make your daily scraping routine more effective and efficient:


1. Task scheduling

If you plan to extract data regularly, task scheduling is exactly what you need and can save you a lot of time. You can schedule a task to run once, on a recurring schedule, or repeatedly at a fixed interval, such as every 1, 5, 10, or 30 minutes.

STEP 1.

Find your task on the Dashboard and click Not Set under the Next Run column. There are two Not Set entries: the first sets a Cloud schedule and the second sets a local schedule.

set_schedule.jpg

STEP 2.

Choose how often you would like to run the task.

mceclip0.png

STEP 3.

For recurring crawls, select the day of the week or day of the month and the time of day to run your task.

monthly.png
weekly1.png

For repeating crawls, select the desired time interval.

interval.png

STEP 4.

You can also save the settings for later use. Give the settings a name and click Save. This way, you can always select the saved schedule setting and apply it directly to any other task.

saveschedule.gif

STEP 5.

After everything's done, click Schedule ON to start running the task on the schedule right away.

schedule_on.jpg

STEP 6.

When a task is scheduled, you'll see the next run time on the Dashboard.

next_run_time.jpg

You can easily turn the schedule on and off by clicking the next run time on the Dashboard and choosing Schedule ON or Schedule OFF.


2. Auto-data export (for Cloud data)

Data export to a database can also be automated and scheduled. If you need to export data to your database regularly, auto-export can save you tons of work.

STEP 1.

Load the cloud data for your task.

STEP 2.

Click Export Data.

exportdata3.png

STEP 3.

Open Auto-export to the database, then select the type of database you use.

database.png

STEP 4.

Complete the information needed to connect to your database and click Test connection to verify that the connection works. Then click Next to map the data fields and choose the desired time interval for the export. (A quick way to confirm that rows are arriving in your table is sketched at the end of this section.)

mceclip0__1_.png

STEP 5.

Lastly, click Next to finish the process.

sqldone.png

STEP 6.

You can find your auto-export tasks in the Database Auto-export Tool.

tool.jpg
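
As referenced in STEP 4, a quick way to confirm that the auto-export is actually writing rows is to query the target table directly. Below is a minimal sketch in Python, assuming a MySQL database and an illustrative table name octoparse_results; the host, credentials, and table name are placeholders, so substitute the same values you entered in the Auto-export settings.

import pymysql

# Connection details are placeholders -- use the same values you entered
# in the Auto-export settings.
connection = pymysql.connect(
    host="localhost",
    user="octoparse_user",
    password="your_password",
    database="scraping_db",
)

try:
    with connection.cursor() as cursor:
        # "octoparse_results" is an assumed table name; replace it with the
        # table you mapped the data fields to.
        cursor.execute("SELECT COUNT(*) FROM octoparse_results")
        (row_count,) = cursor.fetchone()
        print(f"Rows exported so far: {row_count}")
finally:
    connection.close()

If the count grows after each scheduled export interval, the auto-export is working as expected.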

3. Connect via API

With the Octoparse API, you can start scraping tasks, retrieve extracted data, and even edit your tasks programmatically from your own application.

Check the API documentation for details.
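
As a rough illustration of the request/response pattern only, the sketch below uses Python's requests library to authenticate for an access token and then fetch extracted data for a task. The base URL, endpoint paths, and parameter names are placeholders rather than the actual Octoparse API routes, so replace them with the ones listed in the API documentation.

import requests

# Placeholder values -- substitute the real base URL, paths, and parameters
# from the Octoparse API documentation.
BASE_URL = "https://api.example.com"

# Step 1: exchange your account credentials for an access token (hypothetical endpoint).
token_resp = requests.post(
    f"{BASE_URL}/token",
    data={"username": "your_username", "password": "your_password"},
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# Step 2: request extracted data for a task (hypothetical endpoint and parameters).
data_resp = requests.get(
    f"{BASE_URL}/data",
    headers={"Authorization": f"Bearer {access_token}"},
    params={"taskId": "your-task-id", "size": 100},
)
data_resp.raise_for_status()
for row in data_resp.json().get("data", []):
    print(row)

The same token-then-request pattern applies to starting task runs or editing tasks; see the API documentation for the exact endpoints and parameters.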


4. Connect via Zapier

You can send the scraped data to other applications (e.g., Google Drive, Google Sheets, Dropbox) without coding by using Zapier. Check this article for more details: How to Connect Octoparse with Zapier

