Enterprise Data Integration: Oracle & AS/400 → PostgreSQL, Cassandra (Batch or Kafka Streaming)
Need to modernize data trapped in Oracle, IBM AS/400, or flat files? I build robust, scalable data pipelines — both batch (ETL) and real-time (Kafka) — to move your data into PostgreSQL, Apache Cassandra, or other modern systems.
What I Deliver:
- Batch ETL: Talend jobs or Python + Dask for high-volume, parallel processing (see the first sketch after this list).
- Real-time Streaming: change or event capture published to Apache Kafka topics for immediate consumption (second sketch below).
- Data validation, transformation, error handling, and clear runbook documentation.
- Optional: Node.js API or Flutter dashboard to interact with your data.
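To give a feel for the batch approach, here is a minimal Python + Dask sketch: a flat-file extract, basic validation, and a parallel PostgreSQL load. The file name, column names, target table, and connection URI are all placeholders for illustration; a real engagement is matched to your schema (and the load step assumes sqlalchemy plus a PostgreSQL driver are installed).

```python
# Minimal batch ETL sketch: flat file -> validate/transform -> PostgreSQL.
# "orders.csv", the column names, and the connection URI are hypothetical.
import dask.dataframe as dd

SOURCE = "orders.csv"  # e.g. an AS/400 flat-file export (placeholder)
TARGET_URI = "postgresql://user:pass@localhost:5432/warehouse"  # placeholder

# Read in parallel; Dask splits the file into partitions across workers.
df = dd.read_csv(SOURCE, assume_missing=True)

# Basic validation: drop rows missing the key, then deduplicate on it.
df = df.dropna(subset=["order_id"]).drop_duplicates(subset=["order_id"])

# Example transformation: normalise a legacy date column (assumed YYYYMMDD strings).
df["order_date"] = dd.to_datetime(df["order_date"], format="%Y%m%d", errors="coerce")

# Load partitions into PostgreSQL; if_exists="append" keeps existing rows.
df.to_sql("orders", TARGET_URI, if_exists="append", index=False, parallel=True)
```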
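And for the streaming side, a minimal publish sketch using the confluent-kafka client. The broker address, topic name, and event shape are assumptions for illustration; in practice the events would come from your change-capture feed (e.g., an Oracle change table or AS/400 journal), not a hard-coded record.

```python
# Minimal Kafka publish sketch (confluent-kafka client), assuming a broker at
# localhost:9092 and a hypothetical topic "orders.events".
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})

def delivery_report(err, msg):
    # Called once per message to confirm delivery or surface an error.
    if err is not None:
        print(f"Delivery failed: {err}")

def publish_change(event: dict) -> None:
    # Key by record id so updates to the same row land on the same partition.
    producer.produce(
        "orders.events",
        key=str(event["order_id"]),
        value=json.dumps(event).encode("utf-8"),
        callback=delivery_report,
    )
    producer.poll(0)  # serve delivery callbacks without blocking

# Example event, e.g. one row captured from a change table (hypothetical).
publish_change({"order_id": 42, "status": "SHIPPED"})
producer.flush()  # block until all queued messages are delivered
```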
Ideal For:
- Replacing legacy reporting with modern analytics.
- Feeding operational data into microservices or cloud apps.
- Building event-driven architectures with Kafka.
Why Me?
- Daily hands-on work with Oracle, AS/400, Talend, Kafka, Cassandra, and PostgreSQL in production.
- I build production-grade, maintainable solutions, not throwaway scripts.
- Responsive communication in English or Thai (GMT+7).
Pricing:
- Fixed-price projects from £100 to £300, depending on scope.
- Or £20/hour for custom or ongoing pipeline work.
Next Step:
Tell me:
- Your source system (e.g., AS/400 file, Oracle table)
- Your target (e.g., PostgreSQL, Kafka topic, Cassandra)
- Batch or real-time?
- Approx. data volume
I’ll send a tailored proposal within 12 hours.