Job Summary
We are looking to hire a Data Engineer specializing in data lakes, data warehouses, and ETL. You will be responsible for designing, implementing, and maintaining our data infrastructure, working closely with data scientists, analysts, and other stakeholders to ensure seamless data flow, high data quality, and accessibility for analytical and operational use cases.
- Minimum Qualification: Degree
- Experience Level: Mid level
- Experience Length: 3 years
Job Description/Requirements
Responsibilities:
- Design, build, and maintain scalable data lake and data warehouse architectures to store structured and unstructured data.
- Develop and manage ETL (Extract, Transform, Load) processes to ingest data from various sources into the data lake and data warehouse.
- Ensure data quality, data governance, and data security practices are implemented and maintained.
- Collaborate with data scientists and analysts to understand data requirements and provide solutions for data access and analysis.
- Optimize data storage and retrieval performance.
- Monitor and troubleshoot data infrastructure issues, ensuring high availability and reliability.
- Implement and maintain data catalog and metadata management tools.
- Stay updated with the latest trends and technologies in data engineering, data lakes, and data warehouses.
Requirements:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience in data engineering or a similar role.
- Strong experience with data lake storage technologies such as Amazon S3, Azure Data Lake Storage, or Google Cloud Storage.
- Proficiency in ETL tools and processes (e.g., AWS Glue, Apache NiFi, Talend).
- Experience with big data processing frameworks like Apache Spark or Hadoop.
- Knowledge of data warehousing concepts and technologies (e.g., Amazon Redshift, Google BigQuery, Snowflake).
- Experience with SQL and NoSQL databases.
- Familiarity with data governance and data security best practices.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with containerization and orchestration tools like Docker and Kubernetes.
- Knowledge of data catalog and metadata management tools (e.g., AWS Glue Data Catalog, Apache Atlas).
- Experience with data visualization tools and techniques.
- Relevant certifications in data engineering or cloud platforms.