Octoparse API FAQs
Written by Scarlett
Updated over 6 months ago

Q1

Can I use the API to export Local Extraction data?

A1

No. The API can only export data stored in the Cloud. However, you can first back up your local data to the Cloud and then export it through the API.


Q2

Why am I receiving a "403 Not Authorized" or "Permission denied" error?

A2

A "403 Not Authorized" or "Permission denied" error means the request lacks authorization to access the API. It typically occurs when a request requires a Professional account but the account making it is on the Standard Plan. Verify your API permissions before submitting a request.


Q3

How long is the refresh token valid?

A3

The refresh token remains valid for 15 days.
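Since the refresh token expires after 15 days, an API client should record when the token was issued and fall back to a full re-login once the token is too old to refresh. A minimal sketch of that check, assuming you store the issue time yourself (the function name is illustrative, not part of any Octoparse SDK):

```python
from datetime import datetime, timedelta

REFRESH_TOKEN_LIFETIME = timedelta(days=15)  # validity period per the answer above

def needs_relogin(issued_at: datetime, now: datetime) -> bool:
    """Return True once the refresh token issued at `issued_at` has expired,
    meaning the client must log in again instead of refreshing."""
    return now - issued_at >= REFRESH_TOKEN_LIFETIME

issued = datetime(2024, 1, 1)
print(needs_relogin(issued, datetime(2024, 1, 10)))  # False: 9 days old, still refreshable
print(needs_relogin(issued, datetime(2024, 1, 17)))  # True: 16 days old, must re-login
```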


Q4

What causes the "429 Limit Times" error?

A4

Octoparse limits API requests to 20 per second. If you receive a 429 status code, reduce the frequency of your requests.
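A common way to cope with this limit is to retry with exponential backoff whenever a 429 comes back. A minimal sketch, assuming `request_fn` is your own wrapper that performs one API call and returns a `(status_code, payload)` pair (the wrapper is hypothetical, not an Octoparse SDK function):

```python
import time

def call_with_backoff(request_fn, max_retries=5, base_delay=0.5):
    """Call `request_fn` and, on a 429 status, wait with exponential
    backoff before retrying; give up after `max_retries` attempts."""
    for attempt in range(max_retries):
        status, payload = request_fn()
        if status != 429:
            return status, payload
        time.sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...
    raise RuntimeError(f"still rate-limited after {max_retries} attempts")
```

Spacing requests so that no more than 20 are sent per second avoids the error in the first place; the backoff is a safety net for bursts.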


Q5

Why does the Export non-exported data API keep returning the same data?

A5

Use the Mark as exported function to mark the current batch of unexported data before exporting the next batch.


Q6

Why does the export data by offset function keep returning the same batch of data?

A6

You need to use the offset returned in the API response to retrieve the next set of data rows. For example, if the response returns an offset of 100 but you request again with offset=0 or offset=10, you will get the same data rows as before.


Q7

How do I get the next 1,000 rows through the Export non-exported data API when my data exceeds 1,000 rows?

A7

After exporting the first batch of non-exported data, use the Mark as exported function to mark that batch as done. The next call to Export non-exported data will then return the following 1,000 rows, and so on.
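The fetch-then-mark cycle described above can be written as a loop. Here `fetch_batch` and `mark_exported` are hypothetical stand-ins for the Export non-exported data and Mark as exported calls, not actual SDK functions:

```python
def drain_non_exported(fetch_batch, mark_exported, batch_size=1000):
    """Collect every row by alternating 'fetch non-exported' and
    'mark as exported' until no unexported rows remain."""
    all_rows = []
    while True:
        rows = fetch_batch(batch_size)  # at most 1,000 rows per request
        if not rows:
            break                       # everything has been exported
        all_rows.extend(rows)
        mark_exported()                 # without this, the same rows come back
    return all_rows
```

With 2,500 unexported rows, for instance, this performs three fetch/mark cycles (1,000 + 1,000 + 500 rows) followed by one final empty fetch that ends the loop.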


Q8

Why can I only export 1,000 rows with the Get data by offset function? How do I retrieve all of the rows?

A8

Each API request exports at most 1,000 rows, so retrieving all the data takes multiple requests. Use an offset of 0 in the first request to get the first 1,000 rows, then use the offset value returned in that response (which may be greater than 1,000) in the next request to get the following rows, and so on until no rows are returned.
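The pagination above can be sketched as a loop that always feeds the offset returned by one response into the next request. `get_page` is a hypothetical wrapper around the Get data by offset call, returning a `(rows, next_offset)` pair:

```python
def fetch_all_by_offset(get_page, page_size=1000):
    """Page through all rows, up to 1,000 at a time, using the offset
    returned by each response to request the next page."""
    rows, offset = [], 0
    while True:
        batch, next_offset = get_page(offset, page_size)
        if not batch:
            break              # no rows left past this offset
        rows.extend(batch)
        offset = next_offset   # may jump ahead if earlier rows were deleted
    return rows
```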


Q9

Why is the returned offset so high? I did not request that many rows of data.

A9

This happens when data rows from previous runs have been deleted, so the remaining data does not start at offset 0. For example, if the first run of a task produced 100 data rows, they were assigned offsets 0–99. If that batch of 100 rows is deleted and the task runs again, the 100 new rows start at offset 100, because offsets 0–99 have already been used.


Q10

Can I programmatically create new crawlers through the API and then run extractions with them via the same API?

A10

Currently, the Octoparse API only supports retrieving extracted data, getting task details, and managing tasks/crawlers (starting/stopping extraction). To create or modify a crawler, you must use the task configuration interface within the Octoparse app.
