Job Title: Data & AI Senior Engineer
Singapore, SG
Seeking a skilled Data Engineer to lead Snowflake-based solutions supporting AI and data initiatives. This role drives scalable infrastructure, legacy data warehouse migration, and robust ETL/ELT pipelines. The ideal candidate will have hands-on experience implementing the Snowflake Data Cloud, strong ETL/ELT capabilities, and a proven ability to collaborate with business stakeholders. The engineer ensures secure, high-performance architecture aligned with AI workflows while enforcing governance, quality, and seamless integration across platforms.
Design Responsibilities:
- Analyse internally and externally sourced raw data to generate BI and Advanced Analytics datasets based on stakeholder requirements.
- Design scalable data pipelines to curate sourced data into the in-house data warehouse.
- Develop data marts to facilitate dataset consumption by business and IT stakeholders.
- Propose data model changes that align with in-house data warehouse standards.
- Define and execute migration activities from legacy databases to the Snowflake-based data warehouse.
- Implement and manage Snowflake governance (access control, data security, usage monitoring).
- Support AI use cases through data preparation and integration.
- Collaborate with cross-functional teams to deliver data-driven solutions.
Engineering Responsibilities:
- Data Pipeline Development: Design and implement ETL/ELT processes for AI and ML models.
- Infrastructure Management: Select and manage cloud-based data storage solutions.
- Model Deployment Support: Prepare datasets and environments for training and deploying AI models.
- Real-Time Analytics: Handle unstructured data and support real-time data processing.
- AI-Specific Tools: Work with vector databases, LLM pipelines, and frameworks like TensorFlow or PyTorch.
Collaboration & Governance:
- Collaborate with data scientists, ML engineers, and business stakeholders.
- Ensure data governance, security, and compliance.
- Monitor and optimise AI model performance.
Requirements:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related quantitative discipline.
- 4–5 years of hands-on experience implementing the Snowflake Data Cloud in a production environment.
- Strong expertise in ETL/ELT tools and frameworks (e.g., AWS Glue, dbt, Talend, Informatica).
- Experience with MLOps and LLM tooling such as MLflow, Docker, and LangChain.
- Proven experience in data warehouse migration and cloud data architecture.
- Solid understanding of data modelling, SQL, and BI/Analytics concepts.
- Experience working closely with business teams to deliver data solutions aligned with strategic goals.
- Familiarity with Snowflake governance best practices.
- SnowPro Data Engineer certification.
- Deep knowledge of Snowflake performance optimisation, governance, and security.
- Experience with cloud technologies such as AWS RDS, AWS Fargate, AWS S3, and Azure.
- Familiarity with PostgreSQL, MS SQL, and other relational databases.
- Strong programming skills in Python and/or Java.
- Understanding of LLMs, RAG pipelines, and generative AI deployment.
- Strong problem-solving and analytical thinking.
- Excellent communication and stakeholder engagement skills.
- Ability to work independently and manage multiple priorities.
- Proactive mindset with a focus on continuous improvement.