Date: 12/09/2024
Develop ETL jobs in AWS Glue using PySpark per business requirements and write the output to S3 and Redshift. Develop AWS Glue jobs that load data into Redshift tables. Ingest data from on-premises systems into AWS using Spark jobs and configuration files. Develop ETL jobs using DBT (Data Build Tool) and write data into Snowflake. Follow the RAD release methodology. Build AWS cloud infrastructure using HashiCorp Terraform. Create stored procedures to ingest data into Redshift. Implement SCD0, SCD1, and SCD2 logic to ingest clients', advisors', and financial-related data when the source data changes (see the PySpark sketch below). Create PySpark jobs to migrate historical data from Redshift to RDS Postgres. Perform table-level migration from Redshift to RDS Postgres. Create Glue Workflows to run Glue jobs and crawlers sequentially.
Skills: DBT (Data Build Tool), Snowflake, Terraform, AWS Glue, Amazon S3, PySpark, and Amazon Redshift.
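The SCD Type 2 duty above follows a standard change-data pattern. Below is a minimal PySpark sketch of that step only, assuming a hypothetical client/advisor schema (client_id key, advisor_id as the tracked attribute, is_current/start_date/end_date control columns) and placeholder S3 paths; it stages the result to S3 rather than performing the Redshift merge, which the posting does not describe.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scd2_sketch").getOrCreate()

# Current dimension rows and the latest source extract.
# S3 paths and column names are hypothetical placeholders.
dim = spark.read.parquet("s3://example-bucket/dim_client/")
src = spark.read.parquet("s3://example-bucket/stage_client/")

# Only the currently active dimension rows participate in change detection.
active = dim.filter(F.col("is_current") == True)

# Source rows whose tracked attribute differs from the active dimension row.
changed = (
    src.alias("s")
    .join(active.alias("d"), F.col("s.client_id") == F.col("d.client_id"))
    .filter(F.col("s.advisor_id") != F.col("d.advisor_id"))
    .select("s.*")
)

# SCD2: keep history by closing out the superseded versions.
expired = (
    active.join(changed.select("client_id"), "client_id", "inner")
    .withColumn("is_current", F.lit(False))
    .withColumn("end_date", F.current_date())
)

# Append the new versions with fresh effective dates.
new_rows = (
    changed
    .withColumn("is_current", F.lit(True))
    .withColumn("start_date", F.current_date())
    .withColumn("end_date", F.lit(None).cast("date"))
)

# Stage the combined changes; the downstream Redshift load/merge is omitted.
result = expired.unionByName(new_rows, allowMissingColumns=True)
result.write.mode("overwrite").parquet("s3://example-bucket/dim_client_changes/")
```

SCD0 and SCD1 handling differ only in how changes are applied (ignore the change, or overwrite in place), so the same change-detection join can be reused for those cases.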
Requirements: Bachelor’s degree in Engineering, Science, or Business Administration with 5 years of experience.
Contact: Email resume to ravi@senainfotech.com or mail it to President, Sena Info Technologies, 1345 Monroe Ave, Ste 241, Grand Rapids, MI 49505. Mention the job title in the subject line.