
Technical Architect - Big Data

$9,000 - $13,000 / month



Key Competencies

· Define reusable design patterns, e.g. data movement (inbound/outbound) processes, data validation, and data load and extraction processes

· Create reusable connections to different targets (databases, Hadoop, etc.)

· Set up a metadata repository for connections, schemas, and contexts to be used across jobs

· Define and develop foundational components: error/audit framework, source-acquisition ETL framework, etc.

· Evaluate and recommend development methodologies, frameworks, and modelling techniques

· Provide technical leadership to the development team in preparing design artifacts and implementing integrations

· Participate in design/code reviews and ensure that all solutions align with pre-defined architectural specifications

· Optimize development through solution accelerators and value creators
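The "data validation" and "error/audit framework" competencies above typically mean keeping validation separate from loading, so that bad records are logged for review rather than failing the whole batch. A minimal sketch of that pattern in Python (all names invented for illustration, not taken from the posting):

```python
# Hypothetical sketch of a reusable validation step with an audit trail:
# invalid records are recorded, valid ones pass through to the load stage.

from dataclasses import dataclass, field


@dataclass
class AuditLog:
    """Collects per-record validation errors for later review."""
    errors: list = field(default_factory=list)

    def record(self, row_id, message):
        self.errors.append((row_id, message))


def validate_rows(rows, required_fields, audit):
    """Return only rows carrying all required fields; log the rest."""
    valid = []
    for i, row in enumerate(rows):
        missing = [f for f in required_fields if not row.get(f)]
        if missing:
            audit.record(i, f"missing fields: {missing}")
        else:
            valid.append(row)
    return valid


audit = AuditLog()
rows = [
    {"id": 1, "name": "a"},
    {"id": 2},  # missing "name" -> goes to the audit log, not the load
]
clean = validate_rows(rows, ["id", "name"], audit)
```

The same separation generalizes to any target: the loader only ever sees `clean`, and the audit log can feed a reconciliation report.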

Experience

· Strong affinity for IT solutions and data architecture at both the macro and the micro level

· Conceptual strength with a track record in (big) data solutions, including data modeling (structured/unstructured data)

· Sound experience in driving and delivering change

Besides data management and modeling, the architect needs to be familiar with, or have prior experience in, other analytics areas, such as dashboarding.

  • Hands-on experience in design and development on a big data platform (GCP, AWS, Azure, etc.)
  • Hands-on programming experience on big data (Python, PySpark, Spark Scala, etc.)
  • Hands-on experience with cloud services (Kinesis, Kafka, Lambda, Databricks, etc.)
  • Hands-on experience with storage, covering schema layout, data modeling, partitioning, and RDBMS (Hive, HBase, MongoDB, MySQL, SQL, Cassandra, S3, DynamoDB, Redshift, etc.)
  • Expertise in understanding client business and providing solution architecture in the cloud, covering data ingestion, storage, and visualization
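The "partitioning" item in the storage bullet above usually refers to Hive-style partition layouts, where records land under key=value path segments so query engines can prune whole directories. A small hypothetical sketch of that layout (names and paths invented for illustration):

```python
# Hypothetical sketch of Hive-style partition key layout: records are grouped
# under path segments like country=SG/year=2024, so engines such as Hive,
# Spark, or Athena can skip partitions that a query's filter rules out.

def partition_path(base, record, keys):
    """Build a Hive-style partition path (key=value segments) for a record."""
    segments = [f"{k}={record[k]}" for k in keys]
    return "/".join([base] + segments)


p = partition_path(
    "s3://bucket/events",
    {"country": "SG", "year": 2024, "id": 7},
    ["country", "year"],
)
# p == "s3://bucket/events/country=SG/year=2024"
```

Choosing the partition keys is the schema-layout decision the bullet alludes to: low-cardinality columns that queries filter on make good keys, while high-cardinality ones (like `id` here) are left out of the path.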
