For organizations on the EU instance, use `eu.api.smith.langchain.com` as the API endpoint.
The `backend` and `queue` services require write access to the destination bucket:

- The `backend` service attempts to write a test file to the destination bucket when the export destination is created. It will delete the test file if it has permission to do so (delete access is optional).
- The `queue` service is responsible for bulk export execution and uploading the files to the bucket.

Use the returned `id` to reference this destination in subsequent bulk export operations.
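As a sketch, creating a destination might look like the following (Python standard library only). The endpoint path `/api/v1/bulk-exports/destinations`, the request field names, and the `X-API-Key` header are assumptions and should be verified against your deployment's API reference; all bucket and key values are placeholders.

```python
import json
import urllib.request

# Placeholder values -- substitute your own deployment and bucket details.
API_URL = "https://api.smith.langchain.com"  # or https://eu.api.smith.langchain.com
API_KEY = "<your-api-key>"

payload = {
    "destination_type": "s3",
    "display_name": "My S3 Destination",
    "config": {
        "bucket_name": "my-export-bucket",
        "prefix": "langsmith-exports",
        "region": "us-east-1",
    },
    "credentials": {
        "access_key_id": "<access-key-id>",
        "secret_access_key": "<secret-access-key>",
    },
}

# The backend service writes (and, if permitted, deletes) a test file
# in the bucket when this request is processed.
request = urllib.request.Request(
    f"{API_URL}/api/v1/bulk-exports/destinations",
    data=json.dumps(payload).encode("utf-8"),
    headers={"X-API-Key": API_KEY, "Content-Type": "application/json"},
    method="POST",
)
# response = urllib.request.urlopen(request)  # response body includes the destination "id"
```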
If you receive an error while creating a destination, see the debug destination errors section below for troubleshooting guidance.
Available in LangSmith self-hosted version 0.10.34 (application version >= 0.10.91) and later.

Provide your `access_key_id` and `secret_access_key` credentials:

If you are using temporary credentials, include a `credentials.session_token` key when creating the bulk export destination. Alternatively, omit the `credentials` key from the request when creating the bulk export destination. In this case, the standard Boto3 credentials locations will be checked in the order defined by the library.

For S3-compatible storage, set `endpoint_url` and supply the region that matches the region of your bucket. For GCS, use an `endpoint_url`, which is typically `https://storage.googleapis.com`.
Here is an example of the API request when using the GCS XML API, which is compatible with S3:
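A minimal sketch of such a request follows. The endpoint path and field names are assumptions to verify against your deployment's API reference; the GCS XML API authenticates with HMAC keys, and the bucket, region, and key values below are placeholders.

```python
import json
import urllib.request

API_URL = "https://api.smith.langchain.com"  # placeholder deployment URL
API_KEY = "<your-api-key>"

payload = {
    "destination_type": "s3",
    "display_name": "GCS Export Destination",
    "config": {
        "bucket_name": "my-gcs-bucket",
        "prefix": "langsmith-exports",
        # Point the S3-compatible client at the GCS XML API.
        "endpoint_url": "https://storage.googleapis.com",
        "region": "us-central1",
    },
    "credentials": {
        # GCS HMAC keys for the XML (S3-compatible) API.
        "access_key_id": "<gcs-hmac-access-key-id>",
        "secret_access_key": "<gcs-hmac-secret>",
    },
}

request = urllib.request.Request(
    f"{API_URL}/api/v1/bulk-exports/destinations",
    data=json.dumps(payload).encode("utf-8"),
    headers={"X-API-Key": API_KEY, "Content-Type": "application/json"},
    method="POST",
)
# response = urllib.request.urlopen(request)
```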
The `session_id` is also known as the Tracing Project ID; it can be copied from the individual project view by clicking into the project in the Tracing Projects list.

Use the returned `id` to reference this export in subsequent bulk export operations.
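Creating an export against that destination can be sketched as follows; the `/api/v1/bulk-exports` path and field names are assumptions to check against your deployment's API reference, and the IDs and timestamps are placeholders.

```python
import json
import urllib.request

API_URL = "https://api.smith.langchain.com"  # placeholder deployment URL
API_KEY = "<your-api-key>"

payload = {
    # The "id" returned when the destination was created.
    "bulk_export_destination_id": "<destination-id>",
    # The Tracing Project ID, copied from the project view.
    "session_id": "<tracing-project-id>",
    "start_time": "2025-07-01T00:00:00Z",
    "end_time": "2025-07-02T00:00:00Z",
}

request = urllib.request.Request(
    f"{API_URL}/api/v1/bulk-exports",
    data=json.dumps(payload).encode("utf-8"),
    headers={"X-API-Key": API_KEY, "Content-Type": "application/json"},
    method="POST",
)
# response = urllib.request.urlopen(request)  # response body includes the export "id"
```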
Available in LangSmith self-hosted version 0.10.42 (application version >= 0.10.109) and later.

To create a scheduled export, specify `interval_hours` and remove `end_time`:

- `interval_hours` must be between 1 hour and 168 hours (1 week), inclusive.
- The first export runs with `start_time=(scheduled_export_start_time), end_time=(start_time + interval_hours)`. The next runs with `start_time=(previous_export_end_time), end_time=(this_export_start_time + interval_hours)`, and so on.
- `end_time` must be omitted for scheduled exports; it is still required for non-scheduled exports.
- Exports created from a scheduled export have the `source_bulk_export_id` attribute filled.
- Each export runs at `end_time + 10 minutes` to account for any runs that are submitted with an `end_time` in the recent past.

For example, with `start_time=2025-07-16T00:00:00Z` and `interval_hours=6`:
| Export | Start Time | End Time | Runs At |
|---|---|---|---|
| 1 | 2025-07-16T00:00:00Z | 2025-07-16T06:00:00Z | 2025-07-16T06:10:00Z |
| 2 | 2025-07-16T06:00:00Z | 2025-07-16T12:00:00Z | 2025-07-16T12:10:00Z |
| 3 | 2025-07-16T12:00:00Z | 2025-07-16T18:00:00Z | 2025-07-16T18:10:00Z |
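A scheduled export request under these rules can be sketched as below: the same create-export call, but with `interval_hours` and no `end_time`. The endpoint path and field names are assumptions to verify against your deployment's API reference.

```python
import json
import urllib.request

API_URL = "https://api.smith.langchain.com"  # placeholder deployment URL
API_KEY = "<your-api-key>"

payload = {
    "bulk_export_destination_id": "<destination-id>",
    "session_id": "<tracing-project-id>",
    "start_time": "2025-07-16T00:00:00Z",
    # Between 1 and 168 (one week), inclusive. No "end_time" key:
    # each window's end is computed as start_time + interval_hours.
    "interval_hours": 6,
}

request = urllib.request.Request(
    f"{API_URL}/api/v1/bulk-exports",
    data=json.dumps(payload).encode("utf-8"),
    headers={"X-API-Key": API_KEY, "Content-Type": "application/json"},
    method="POST",
)
# response = urllib.request.urlopen(request)
```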
Replace `{export_id}` with the ID of the export you want to monitor. This command retrieves the current status of the specified export job.
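A status check can be sketched as a GET on the export resource; the `/api/v1/bulk-exports/{export_id}` path is an assumption to verify against your deployment's API reference.

```python
import urllib.request

API_URL = "https://api.smith.langchain.com"  # placeholder deployment URL
API_KEY = "<your-api-key>"
export_id = "<export-id>"  # the "id" returned when the export was created

request = urllib.request.Request(
    f"{API_URL}/api/v1/bulk-exports/{export_id}",
    headers={"X-API-Key": API_KEY},
    method="GET",
)
# response = urllib.request.urlopen(request)  # response body includes the export status
```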
Replace `{export_id}` with the ID of the export you wish to cancel. Note that a job cannot be restarted once it has been cancelled; you will need to create a new export job instead.
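Cancellation can be sketched as an update to the export resource. Both the `PATCH` verb and the `{"status": "Cancelled"}` body are assumptions here; confirm the exact cancellation request against your deployment's API reference.

```python
import json
import urllib.request

API_URL = "https://api.smith.langchain.com"  # placeholder deployment URL
API_KEY = "<your-api-key>"
export_id = "<export-id>"

# Assumed cancellation payload -- verify the field name and value.
payload = {"status": "Cancelled"}

request = urllib.request.Request(
    f"{API_URL}/api/v1/bulk-exports/{export_id}",
    data=json.dumps(payload).encode("utf-8"),
    headers={"X-API-Key": API_KEY, "Content-Type": "application/json"},
    method="PATCH",
)
# response = urllib.request.urlopen(request)
# A cancelled job cannot be restarted; create a new export instead.
```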
To download the exported files from an S3-compatible bucket, use the `--endpoint-url` option. For GCS, the `endpoint_url` is typically `https://storage.googleapis.com`:
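To keep the examples in one language, here is the AWS CLI invocation assembled in Python; the bucket name and prefix are placeholders, and running it requires the AWS CLI installed with valid credentials configured.

```python
import subprocess

# Download all exported files from the bucket prefix to a local directory.
cmd = [
    "aws", "s3", "cp",
    "s3://my-export-bucket/langsmith-exports/",  # placeholder bucket/prefix
    "./exported-data/",
    "--recursive",
    # Point the CLI at the S3-compatible endpoint; omit for AWS S3 itself.
    "--endpoint-url", "https://storage.googleapis.com",
]
# subprocess.run(cmd, check=True)  # requires the AWS CLI and credentials
```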
Any errors encountered during an export are recorded in the `errors` field of the run.

| Error | Description |
|---|---|
| Access denied | The blob store credentials or bucket are not valid. This error occurs when the provided access key and secret key combination doesn't have the necessary permissions to access the specified bucket or perform the required operations. |
| Bucket is not valid | The specified blob store bucket is not valid. This error is thrown when the bucket doesn't exist or there is not enough access to perform writes on the bucket. |
| Key ID you provided does not exist | The blob store credentials provided are not valid. This error occurs when the access key ID used for authentication is not a valid key. |
| Invalid endpoint | The `endpoint_url` provided is invalid. Only S3-compatible endpoints are supported, for example `https://storage.googleapis.com` for GCS or `https://play.min.io` for MinIO. If you are using AWS S3, omit the `endpoint_url`. |