Updating External System Product Catalog#
Use Case Description#
This document outlines how an external system website can update its product catalog, including pricing and discounts, through API services. The process covers two primary use cases:
1. Initial Setup: Full catalog and product import.
2. Subsequent Updates: Incremental updates to the catalog and product details.
In both scenarios, data is fetched in a paginated format via the respective API services. The external system platform processes this data and updates its internal systems accordingly.
Actors#
External System: The website that wants to update its product catalog and prices.
4ws.trade: Provides access to product and catalog data through its API services.
Product and Catalog Services: Backend systems that store and provide catalog and product data.
Preconditions#
The external system must have the necessary permissions and access to the document creation service; please refer to Authentication. The catalog code must be known for updates, or the system must retrieve the catalog list first.
For updates, the system must store the last successful API invocation time to ensure accurate incremental updates.
APIs Used#
1. Get bulk CSV Export (initial setup):
CATALOG_EXPORT: Retrieves catalog details.
PRODUCTS_CATALOG_EXPORT: Retrieves product details from the catalog.
2. Incremental update services:
Get products Catalog: Retrieves updated products based on the catalog code and a specified fromDatetime.
Get product prices: Retrieves pricing information for a specific product (sale price and discount).
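As a rough sketch of how the two incremental services might be combined during an update run (the endpoint paths, query parameter names, and response shapes below are illustrative placeholders only; see the API reference for the actual definitions), the flow could look like this in Node.js:

```js
// Incremental update sketch: fetch products changed since the last sync, then fetch the
// current price and discount for each of them. Endpoints and field names are placeholders.
const BASE_URL = 'https://api.example.com';          // placeholder base URL
const HEADERS = { Authorization: 'Bearer <token>' }; // placeholder credentials

async function fetchUpdatedProducts(catalogCode, fromDatetime) {
  const query = new URLSearchParams({ catalogCode, fromDatetime });
  const res = await fetch(`${BASE_URL}/products-catalog?${query}`, { headers: HEADERS });
  if (!res.ok) throw new Error(`Get products Catalog failed: HTTP ${res.status}`);
  return res.json(); // assumed to return an array of updated products
}

async function fetchProductPrice(productCode) {
  const res = await fetch(`${BASE_URL}/product-prices/${productCode}`, { headers: HEADERS });
  if (!res.ok) throw new Error(`Get product prices failed: HTTP ${res.status}`);
  return res.json(); // assumed shape: { salePrice, discount }
}

async function runIncrementalUpdate(catalogCode, fromDatetime) {
  const products = await fetchUpdatedProducts(catalogCode, fromDatetime);
  for (const product of products) {
    const price = await fetchProductPrice(product.code);
    // ...update the product and its sale price/discount in the website's own database...
  }
}
```

A fuller, paginated example is given in the Code Examples in Node.js section below.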
Main Success Scenario#
1. The external system calls Get bulk CSV Export for CATALOG_EXPORT to obtain catalog details.
2. Once the catalog is known, it calls Get bulk CSV Export for PRODUCTS_CATALOG_EXPORT to retrieve the list of products within the catalog.
3. The website processes the data, updating its internal product catalog and prices.
Some services are paginated to handle large sets of data efficiently.
For detailed information on how to handle pagination, refer to our Paginated Responses documentation.
Alternate Flows#
Catalog Code Not Known: If the catalog code is not known, the system must first call Get Catalog details by Code to get the list of available catalogs.
Error Handling: If a request fails, the system should retry the request and apply error-handling mechanisms to prevent partial updates.
Sequence Diagram#
Code Examples in Node.js#
Fetching Catalog Data and Handling Pagination#
This example demonstrates how to handle pagination and subsequent API calls. The moreRows attribute indicates whether additional pages of data are available.
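A minimal sketch of this flow in Node.js (18+, using the built-in fetch) might look as follows. The base URL, endpoint paths, query parameter names, and response fields are illustrative assumptions; only the moreRows flag comes from this document, so substitute the real values from the API reference and your Authentication credentials.

```js
// Minimal sketch of the main success scenario: fetch the catalog, then its products,
// following pagination until moreRows is false.
// NOTE: the base URL, endpoint paths, query parameters and response shape below are
// placeholders for illustration; replace them with the values from the API reference.
const BASE_URL = 'https://api.example.com';              // placeholder base URL
const AUTH_HEADER = { Authorization: 'Bearer <token>' }; // placeholder credentials

// Fetch every page of a bulk CSV export (CATALOG_EXPORT or PRODUCTS_CATALOG_EXPORT).
async function fetchBulkExport(exportType, params = {}) {
  const rows = [];
  let page = 1;
  let moreRows = true;

  while (moreRows) {
    const query = new URLSearchParams({ ...params, page: String(page) });
    const response = await fetch(
      `${BASE_URL}/bulk-csv-export/${exportType}?${query}`, // placeholder path
      { headers: AUTH_HEADER }
    );
    if (!response.ok) {
      throw new Error(`Export ${exportType} failed with HTTP ${response.status}`);
    }

    const body = await response.json(); // assumed shape: { rows: [...], moreRows: boolean }
    rows.push(...body.rows);
    moreRows = body.moreRows; // moreRows signals whether another page of data is available
    page += 1;
  }
  return rows;
}

// Main success scenario: catalog first, then the products of that catalog.
async function updateProductCatalog() {
  const catalogs = await fetchBulkExport('CATALOG_EXPORT');
  const catalogCode = catalogs[0]?.code; // pick the catalog relevant to your site
  const products = await fetchBulkExport('PRODUCTS_CATALOG_EXPORT', { catalogCode });
  // ...update the website's internal catalog and prices with `products`...
  console.log(`Fetched ${products.length} products for catalog ${catalogCode}`);
}

updateProductCatalog().catch(console.error);
```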
Best Practices for CSV Processing#
1. Use streaming to process large CSV files efficiently. Libraries like csv-parser in Node.js allow you to process rows one at a time without loading the entire file into memory (see the sketch below).
Implement chunking mechanisms if the file size is too large to process in one go (it is recommended to stage the data in an intermediate "border" table).
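As a minimal sketch of the streaming approach (assuming the csv-parser package is installed and the export has already been downloaded to a local file; the column names used in the validation step are illustrative placeholders), rows are handled one at a time and invalid rows are logged and skipped rather than aborting the run:

```js
// Stream a downloaded CSV export row by row instead of loading it into memory.
// Assumes `npm install csv-parser`; the validated columns (code, price) are placeholders.
const fs = require('fs');
const csv = require('csv-parser');

function processCsvExport(filePath, onRow) {
  return new Promise((resolve, reject) => {
    let processed = 0;
    let skipped = 0;

    fs.createReadStream(filePath)
      .pipe(csv())
      .on('data', (row) => {
        // Skip and log invalid rows instead of failing the whole import.
        if (!row.code || row.price === '') {
          skipped += 1;
          console.warn('Skipping invalid row:', row);
          return;
        }
        onRow(row);
        processed += 1;
      })
      .on('end', () => resolve({ processed, skipped }))
      .on('error', reject);
  });
}

// Example usage: update one product per row (updateProduct is a hypothetical helper).
// processCsvExport('./products_catalog_export.csv', (row) => updateProduct(row))
//   .then((stats) => console.log('CSV processed:', stats));
```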
2. Unreadable Files: Implement retry logic for failed file downloads or unreadable files. Log errors for audit and debugging purposes.
Invalid Rows: If a specific row in the CSV is invalid, log the error but continue processing the remaining rows. Depending on the severity of the error, you may want to skip that row or mark it for manual review.
3. Retries: Implement retry mechanisms with exponential backoff for failed API requests or file downloads (see the sketch below).
Data Consistency: Ensure that updates are applied transactionally. If part of the file is processed, but an error occurs, either roll back the changes or store the partial progress to resume processing later.
Logging and Monitoring: Track the progress of CSV processing and ensure that any issues (file corruption, invalid data) are logged and alerted to the system administrators.
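A minimal sketch of a retry helper with exponential backoff (the attempt count and base delay are arbitrary example values; tune them for your workload):

```js
// Generic retry helper with exponential backoff for API requests or file downloads.
// maxAttempts and baseDelayMs are example values, not prescribed by the API.
async function withRetry(fn, { maxAttempts = 5, baseDelayMs = 1000 } = {}) {
  for (let attempt = 1; attempt <= maxAttempts; attempt += 1) {
    try {
      return await fn();
    } catch (err) {
      if (attempt === maxAttempts) throw err;
      const delay = baseDelayMs * 2 ** (attempt - 1); // 1s, 2s, 4s, 8s, ...
      console.warn(`Attempt ${attempt} failed (${err.message}); retrying in ${delay} ms`);
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}

// Example usage: wrap any export call or file download in the retry helper.
// const catalogs = await withRetry(() => fetchBulkExport('CATALOG_EXPORT'));
```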
4. Always store the timestamp of the last successful API call. This timestamp is crucial for the fromDatetime parameter in subsequent requests to retrieve only updated data (a minimal sketch follows below).
Ensure that fromDatetime is updated correctly even if some products fail to process.
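A minimal sketch of persisting the last successful synchronization time to a local file and reusing it on the next run (the state file name is an assumption; a database column or key-value store works just as well, and the exact fromDatetime format should be taken from the API reference):

```js
// Persist the timestamp of the last successful API call so that the next run can pass it
// as fromDatetime and fetch only updated data. The state file name is a placeholder.
const fs = require('fs');

const STATE_FILE = './last-sync.json';

function loadLastSyncTime() {
  try {
    return JSON.parse(fs.readFileSync(STATE_FILE, 'utf8')).lastSync;
  } catch {
    return null; // first run: no previous sync recorded, fall back to a full import
  }
}

function saveLastSyncTime(isoTimestamp) {
  fs.writeFileSync(STATE_FILE, JSON.stringify({ lastSync: isoTimestamp }));
}

// Example usage around an incremental update run:
// const startedAt = new Date().toISOString(); // capture before calling the API
// const fromDatetime = loadLastSyncTime();
// ...call "Get products Catalog" with fromDatetime and apply the results...
// saveLastSyncTime(startedAt);                // only after the run completes successfully
```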
5. Implement robust logging for both success and failure states. Use monitoring tools to alert administrators to potential issues in file processing or data updates (a minimal sketch follows below).
Use metrics like file size, processing time, and error rates to identify any performance bottlenecks.
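A minimal sketch of collecting basic metrics around a processing run (the metric names are arbitrary; forward them to whatever monitoring or alerting tooling you already use):

```js
// Collect simple metrics around a CSV processing run and log the outcome.
// Metric names are arbitrary placeholders; push them to your own monitoring stack.
const fs = require('fs');

async function processWithMetrics(filePath, processFile) {
  const metrics = {
    file: filePath,
    fileSizeBytes: fs.statSync(filePath).size,
    startedAt: new Date().toISOString(),
    durationMs: 0,
    errorCount: 0,
    succeeded: false,
  };

  const start = Date.now();
  try {
    const result = await processFile(filePath); // e.g. the streaming processor shown earlier
    metrics.errorCount = result?.skipped ?? 0;
    metrics.succeeded = true;
  } catch (err) {
    metrics.errorCount += 1;
    console.error('Processing failed:', err);
  } finally {
    metrics.durationMs = Date.now() - start;
    console.log('Processing metrics:', metrics);
    // ...alert administrators here if metrics.succeeded is false or errorCount is high...
  }
  return metrics;
}
```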
By following these best practices, your external system platform can efficiently integrate with the API and manage large-scale data updates without performance issues or data inconsistencies.