Data Engineer

$7,000 - $10,000 / month

Key Responsibilities:

  • Data Infrastructure Development: Design and build scalable, efficient, and secure data infrastructure to support advanced analytics and reporting systems.
  • ETL Pipeline Development: Develop and maintain robust ETL (Extract, Transform, Load) pipelines to process data from various sources, ensuring high-quality, consistent, and accurate data.
  • Data Integration: Integrate multiple internal and external data sources, ensuring seamless data flow across platforms.
  • Data Modeling: Create and maintain data models and schemas that ensure the efficient storage and retrieval of data.
  • Optimization and Performance Tuning: Enhance the performance and scalability of data pipelines, optimizing for speed and reliability.
  • Collaboration: Work closely with data scientists, business analysts, and other teams to ensure that data is organized and accessible for analysis, reporting, and machine learning applications.
  • Data Governance: Implement data quality control processes to maintain data integrity and consistency.
  • Automation: Automate data workflows and systems to ensure efficiency and reduce manual work.
  • Troubleshooting and Maintenance: Monitor the health of data systems, diagnose issues, and maintain optimal data system performance.

Qualifications:

  • Education: Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field (or equivalent professional experience).
  • Experience: 2+ years of experience in data engineering, database management, or a similar field.
    • Experience working with cloud platforms (AWS, Google Cloud, or Azure).
    • Proficiency in relational and NoSQL databases (MySQL, PostgreSQL, MongoDB, Cassandra).
    • Solid experience with ETL processes and tools (e.g., Apache Airflow, Talend, or custom-built pipelines).