- Use the Talend ETL toolset to create new ETL jobs and maintain existing ones.
- Design and implement ETL processes for extracting and transforming data from diverse sources such as Cloudera, PostgreSQL, and SQL Server databases.
- Design and develop the necessary database tables and constraints according to requirements.
- Collaborate with team members to understand source system structures, data retrieval methods, and tooling within the organization.
- Support the development of data transformation logic using ETL tools or scripting languages such as SQL and Python.
- Clean, validate, and transform data to conform to target schema and quality standards.
- Work with the team to execute data quality improvement plans.
- Participate in troubleshooting activities to maintain data integrity and process efficiency.
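The clean/validate/transform responsibility above can be sketched in Python. This is a minimal illustration, not part of the role's actual codebase: the target schema, field names, and date format are hypothetical assumptions chosen for the example.

```python
# Minimal sketch of cleaning and validating raw records against a
# hypothetical target schema before loading. Field names and the
# DD/MM/YYYY source date format are assumptions for illustration.
from datetime import datetime


def transform_record(raw):
    """Clean and conform one raw record; return None if validation fails."""
    cleaned = {}
    try:
        cleaned["customer_id"] = int(raw["customer_id"])      # enforce integer key
        cleaned["name"] = raw["name"].strip().title()         # trim and normalize case
        # Normalize dates to ISO-8601 (assumes source uses DD/MM/YYYY).
        cleaned["signup_date"] = datetime.strptime(
            raw["signup_date"], "%d/%m/%Y"
        ).date().isoformat()
    except (KeyError, ValueError, AttributeError):
        return None  # reject records that fail validation
    return cleaned


rows = [
    {"customer_id": "42", "name": "  alice smith ", "signup_date": "05/03/2024"},
    {"customer_id": "bad", "name": "Bob", "signup_date": "01/01/2024"},
]
cleaned = [r for r in (transform_record(x) for x in rows) if r is not None]
# cleaned -> [{'customer_id': 42, 'name': 'Alice Smith', 'signup_date': '2024-03-05'}]
```

In a Talend job, the same logic would typically live in a tMap or tJavaRow component; a standalone script like this is useful for prototyping and testing the transformation rules.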
- Degree in computer science, mathematics, or engineering
- At least 4-5 years of relevant working experience
- Experience with Talend, Python, and Spark
- Good knowledge of and working experience with databases and the Hadoop ecosystem (Hive, Impala, HDFS)
- Understanding of data-warehousing and data-modeling techniques
- Knowledge of industry-wide visualization and analytics tools