Job Description
The NETSOL Cloud Services department is looking for an experienced Big Data Engineer with solid data engineering, data lake, ETL, and data warehousing (DWH) concepts. You will build data solutions using state-of-the-art technologies to acquire, ingest, and transform big data and publish it into the DWH, following best practices at each ETL stage.
Responsibilities:
- Design and develop data applications on AWS using big data technologies to ingest, process, and analyze large disparate datasets.
- Build robust data pipelines in the cloud using Airflow, Spark/EMR, Kinesis/Kafka, AWS Glue, Lambda, or other AWS technologies.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using SQL and AWS big data technologies.
Qualifications:
- 3+ years of experience delivering data lake and data warehousing projects on AWS
- ETL experience and a solid understanding of SQL (joins, window functions, CTEs)
- EMR / Spark
- Redshift
- ETL tools – Airflow, AWS Glue, etc.
- Python / Scala
- Understanding of dimensional modelling, fact and dimension tables
Nice to Have:
- Snowflake
- Kinesis/Kafka
- Databricks
- DBT
- Lambda
- AI/ML