Kafka Data Pipeline Engineer (Remote Contract)

Job – Ecuador


Job title: Kafka Data Pipeline Engineer (Remote Contract)

Company: Emma De Torre

Job description: I’m helping Century 21 Accent Homes find a top candidate to join their team full-time for the role of Kafka + Data Pipeline Engineer (Remote / Contract).
This is a compelling role because you’ll build advanced messaging systems with Kafka and AI technology for a real estate leader established in 1973.
Compensation:
USD 2.8K – 4K/month.
Location:
Remote (anywhere).
Mission of Century 21 Accent Homes:
"We are pro-actively promoting the interests of our clients through real estate target marketing, honest evaluation of property values, and open communications."

What makes you a strong candidate:
– You are proficient in ksqlDB, Node.js, and Python.
– English: conversational proficiency.
Responsibilities:
– Design and implement Kafka topics for processing communication data from Gmail, LeadSimple, and other sources.
– Build a middleware API to:
– Trigger on incoming Gmail messages.
– Enrich messages using LeadSimple’s API.
– (Optionally) perform Rent Manager lookups for property metadata.
– Filter out spam and irrelevant messages using logic such as sender whitelisting or LeadSimple contact validation.
– Maintain a conversation_context table using Tableflow or ksqlDB for enriched thread metadata.
– Ensure messages are properly routed to Kafka (communication-events) with clean, AI-ready JSON.
– Collaborate with internal teams to define message schemas, enrichment fields, and downstream AI use cases.
– Set up lightweight monitoring and logging for errors and failed enrichments.
– Advise on infrastructure best practices (e.g., using Redis for caching, managing Pub/Sub backpressure, etc.).
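The core of the responsibilities above is transforming raw inbound email into clean, AI-ready JSON for the `communication-events` topic. As a minimal sketch of that shape of work (the whitelist domains, field names, and helper function here are hypothetical, not part of the actual job spec):

```python
import json
from datetime import datetime, timezone

# Hypothetical sender whitelist used for the spam/relevance filter.
SENDER_WHITELIST = {"century21.com", "leadsimple.com"}

def enrich_message(raw_msg, leadsimple_contact=None):
    """Filter a raw Gmail message and return a clean JSON event
    for the `communication-events` Kafka topic, or None if the
    message should be dropped as spam/irrelevant."""
    sender = raw_msg.get("from", "")
    domain = sender.split("@")[-1].lower()
    # Drop messages whose sender is neither whitelisted nor a
    # validated LeadSimple contact.
    if domain not in SENDER_WHITELIST and leadsimple_contact is None:
        return None
    event = {
        "message_id": raw_msg["id"],
        "thread_id": raw_msg.get("threadId"),
        "sender": sender,
        "subject": raw_msg.get("subject", ""),
        "body": raw_msg.get("body", ""),
        # Enrichment data fetched from LeadSimple's API upstream.
        "contact": leadsimple_contact,
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(event)

# Publishing would then be a single call, e.g. with confluent-kafka:
# producer.produce("communication-events", value=event)
```

In the real pipeline, the enrichment lookup and the produce step would be asynchronous with retries, per the requirements below; this sketch only illustrates the event-shaping and filtering logic.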
Requirements
– Proven experience working with Kafka (Confluent Cloud or self-hosted).
– Hands-on experience with:
– Kafka Streams or ksqlDB.
– REST API integrations (especially Gmail API and/or CRMs like LeadSimple).
– Proficiency in Python, Node.js, or similar backend languages.
– Familiarity with event-driven architecture and streaming design patterns.
– Experience with at least one cloud provider (AWS, GCP, or Azure).
– Solid understanding of asynchronous job handling, retry logic, and webhook workflows.
– Ability to structure clean, enriched JSON events for AI, analytics, and automation.
– Excellent communication skills to clearly explain technical concepts to ops-minded teams.
Nice-to-Have:
– Experience with OpenAI API, LangChain, or vector databases like Pinecone or Chroma.
– Experience building agent-style tools (like Slack bots or AI copilots).
– Prior exposure to property management systems or service scheduling tools.
– Experience with Tableflow or other streaming-state tools.

Expected salary:

Location: Cayambe, Pichincha

Posting date: Tue, 29 Apr 2025 22:45:39 GMT

Apply now!