
Executive Systems Analyst (AI Application Data Engineer)

Salary undisclosed


What the role is:
Azure OpenAI services, text-embedding-ada-002, gpt-4o model, embedding model, Hugging Face, Llama 3, AWS Bedrock, model farm, app service, web app, function app, PostgreSQL, storage account, container registry, application gateway, key vault, Terraform scripts, retrieval augmented generation, fine-tuning, graphics processing unit. If you know what we are talking about, you are the one we are looking for! (A short illustrative sketch of the retrieval-augmented generation pattern appears after the role descriptions below.)

What you will be working on:
We are seeking a skilled and motivated AI Application Data Engineer to join our team. In this role, you will assist or lead the review, design, implementation, recommendation and management of existing or proposed large language models (LLMs) for either cloud-based or on-premises systems. You will provide support and technical governance, offering expertise on AI solutions for cloud or on-premises architecture, including deployment and monitoring of the costs and operations of the AI solutions. Your primary responsibilities will include:

Role 1: Design, set up and implement on-premises LLM infrastructure
• Design and execute strategies, including communications and review sessions, for the on-premises LLM setup, and potentially train a basic model in cloud services before bringing it back to the on-premises setup for testing.
• Collaborate closely with other key stakeholders (AI team, infrastructure, database administrators, vendors, etc.) and users (e.g. corporate group or planners/architects) to ensure the requirements and setup align with strategic AI objectives.
• Facilitate staff adoption of these initiatives and ensure a smooth transition from the current process to the recommended setup.
• Conduct horizon scanning for Whole-of-Government (WOG) AI initiatives on a quarterly basis and present the findings and recommendations to the team and Management.
• Take part in the design, implementation, deployment and administration of the AI solutions that support on-premises systems.
• Knowledge of on-premises embedding models, open-source LLMs (e.g. Llama 3), container setup, model farms, model fine-tuning, PostgreSQL databases, storage accounts and Terraform scripting is preferred.

Role 2: Understand, recommend and redevelop LLM models for the cloud environment
• Understand the existing LLM models; recommend, design, redevelop or replace the current model; and implement the LLM solution for case-processing systems.
• Collaborate closely with other key stakeholders (AI team, infrastructure, database administrators, vendors, etc.) and users (e.g. corporate group or planners/architects) to ensure the requirements and setup align with strategic AI objectives.
• Facilitate staff adoption of these initiatives and ensure a smooth transition from the current process to the recommended setup.
• Take part in the design, implementation, deployment and administration of the AI solutions in the cloud environment.
• Knowledge of Azure OpenAI services, the text-embedding-ada-002 and gpt-4o models, embedding models, PostgreSQL databases, storage accounts and Terraform scripting is preferred.
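The stack named above centres on retrieval augmented generation (RAG): embed a question, retrieve the most relevant stored chunks, and let a chat model answer from them. The Python sketch below is a minimal, purely illustrative wiring of that pattern; the Azure OpenAI deployment names (text-embedding-ada-002, gpt-4o), the documents table, the pgvector extension and the environment variables are assumptions for the example, not details of URA's actual systems.

"""Minimal RAG sketch (illustrative only; all names are placeholders)."""
import os

import psycopg2                     # PostgreSQL driver
from openai import AzureOpenAI      # Azure OpenAI client (openai >= 1.x)

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

def answer(question: str) -> str:
    # 1. Embed the question with the embedding deployment.
    emb = client.embeddings.create(
        model="text-embedding-ada-002", input=question
    ).data[0].embedding

    # 2. Retrieve the nearest chunks from a pgvector-backed table.
    literal = "[" + ",".join(str(x) for x in emb) + "]"
    with psycopg2.connect(os.environ["PG_DSN"]) as conn, conn.cursor() as cur:
        cur.execute(
            "SELECT content FROM documents "
            "ORDER BY embedding <=> %s::vector LIMIT 4",
            (literal,),
        )
        context = "\n\n".join(row[0] for row in cur.fetchall())

    # 3. Ask the chat deployment, grounded in the retrieved context.
    reply = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "Answer using only the provided context."},
            {"role": "user",
             "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return reply.choices[0].message.content

The same shape carries over to the on-premises variant in Role 1 by swapping the Azure OpenAI client for a locally hosted open-source model (e.g. Llama 3) served behind an OpenAI-compatible endpoint.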
What we are looking for:
• Systematic and skilled in synthesizing trends and insights to design, recommend and implement AI solutions for on-premises and/or cloud infrastructure
• Bachelor's degree in data science, data analytics, computer science or related disciplines
• Minimum 3 years of working experience in architecting, designing, developing and implementing AI solutions in the cloud (Microsoft Azure, Amazon Web Services (AWS)) or on on-premises infrastructure
• Technical experience in at least two of the following areas:
  o Hands-on experience with Azure OpenAI / AWS Bedrock services
  o Hands-on experience with large language models (LLMs) and embedding models
  o Hands-on experience in building IaC scripts using Terraform
  o Hands-on experience with on-premises environments running Red Hat Linux OS
  o Hands-on experience with cloud environments (Microsoft Azure or AWS services) and app services (e.g. web apps or function apps)
  o Hands-on experience with container registries
  o Hands-on experience in setting up the schemas for a PostgreSQL database (an illustrative sketch follows this list)
  o Hands-on experience in fine-tuning LLM models with pre-defined datasets
• Experience in multi-cloud environments would be an advantage
• Good presentation and writing skills
• Organized, skilled in synthesizing trends and insights, and a team player
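As one concrete reading of the PostgreSQL schema item above, the sketch below creates a simple pgvector-backed embedding store; the table layout, the 1536-dimension size (matching text-embedding-ada-002) and the index choice are assumptions for illustration, not a prescribed design.

"""Illustrative embedding-store schema setup (placeholder names throughout)."""
import os

import psycopg2

DDL = """
CREATE EXTENSION IF NOT EXISTS vector;          -- pgvector
CREATE TABLE IF NOT EXISTS documents (
    id         bigserial    PRIMARY KEY,
    source     text         NOT NULL,           -- originating document/case reference
    content    text         NOT NULL,           -- chunked text passed to the LLM
    embedding  vector(1536) NOT NULL,           -- text-embedding-ada-002 dimension
    created_at timestamptz  NOT NULL DEFAULT now()
);
-- Approximate nearest-neighbour index for cosine-distance retrieval.
CREATE INDEX IF NOT EXISTS documents_embedding_idx
    ON documents USING ivfflat (embedding vector_cosine_ops) WITH (lists = 100);
"""

with psycopg2.connect(os.environ["PG_DSN"]) as conn, conn.cursor() as cur:
    cur.execute(DDL)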
About Urban Redevelopment Authority:
The Urban Redevelopment Authority (URA) is Singapore's national land use planning authority. URA prepares long-term strategic plans, as well as detailed local area plans, for physical development, and then co-ordinates and guides efforts to bring these plans to reality. Prudent land use planning has enabled Singapore to enjoy strong economic growth and social cohesion, and ensures that sufficient land is safeguarded to support continued economic progress and future development.