Job Description
Arbisoft is looking for an experienced AI Data Engineer. The role focuses on building internal services and user-facing products that leverage AI, with core responsibilities in application development, GCP cloud operations, and service integration. The ideal candidate will collaborate with product, data engineering, and platform teams to translate business needs into technical solutions, from prototypes to production services. Key work includes developing APIs (Python, Node.js), managing cloud infrastructure (GCP, Cloud Run, BigQuery), and creating ETL/ELT pipelines to support current and future data and AI products.
Key Responsibilities
- Design and develop robust backend services and APIs that power audio and data-driven products, with clean interfaces and scalable architectures.
- Build and maintain ETL/ELT workflows and data pipelines to support real-time user applications.
- Implement service endpoints and backend components, including Python-based microservices, function-based workloads, and distributed applications.
- Develop reliable, production-ready cloud services on GCP, deploying with services such as Cloud Run, Cloud Run Functions, and GKE.
- Create reusable modules and shared libraries to help organize and streamline development across the Data and Commercialization team.
- Prototype new AI product capabilities and transform validated concepts into maintainable production services.
Qualifications
- Strong, hands-on experience deploying services on Google Cloud Platform with tools such as Cloud Run, GKE, Pub/Sub, and Dataflow.
- Hands-on experience with GCP AI services like the Vertex AI suite is a plus.
- Strong experience with modern relational and NoSQL databases, with proficiency in SQL and Python for data manipulation. Our solutions span multiple storage technologies, including BigQuery, Firestore, Postgres, and more.
- Experience developing robust, maintainable REST APIs and an understanding of scalable service architecture and backend development best practices.
- Familiarity with the GCP networking concepts needed to deploy data products securely.
- Comfort working through ambiguity, collaborating effectively, iterating quickly, and shaping early-stage service designs.
- A driven mindset, ready to establish secure, efficient patterns that streamline development for a team at the cutting edge of AI and data.