Posted 2 days ago

DevOps Engineer (Data Engineering and Pipeline Management)

KDNPLUS

Engineering & Technology

Industry: Entertainment, Events & Sport
Salary: NGN 150,000 - 250,000

Job Summary

We are looking for a talented DevOps Engineer with a strong background in data engineering and pipeline management. The successful candidate will play a critical role in designing, implementing, and maintaining the data pipelines that power our data-centric projects.

  • Minimum Qualification: Degree
  • Experience Level: Mid level
  • Experience Length: 4 years

Job Description/Requirements

Responsibilities:

  • Design, build, and optimize scalable data pipelines for ETL processes.
  • Ensure high availability and reliability of data flow across systems.
  • Implement Infrastructure as Code (IaC) using tools like Terraform or Ansible.
  • Automate deployment processes and system configurations.
  • Set up and manage CI/CD pipelines using tools such as Jenkins, GitLab CI/CD, or CircleCI.
  • Collaborate with development teams to streamline code integration and deployment.
  • Deploy and manage applications on cloud platforms like AWS, Azure, or Google Cloud.
  • Optimize cloud resources for performance and cost-efficiency.
  • Implement monitoring solutions using tools like Prometheus, Grafana, or ELK Stack.
  • Proactively identify and resolve issues to maintain system health.
  • Ensure data security protocols are in place and comply with industry regulations.
  • Conduct regular security assessments and audits.
  • Work closely with cross-functional teams including data scientists, software engineers, and product managers.
  • Participate in architectural discussions and contribute to technical decision-making.


Requirements:

  • Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
  • Minimum of 3 years of experience in DevOps, Data Engineering, or a similar role.
  • Proficiency in programming languages such as Python, Java, or Scala.
  • Experience with big data technologies like Hadoop, Spark, or Kafka.
  • Strong knowledge of containerization tools like Docker and orchestration platforms like Kubernetes.
  • Familiarity with database systems, both SQL and NoSQL.
  • Experience with version control systems like Git.
  • Excellent problem-solving and analytical abilities.
  • Strong communication and collaboration skills.
  • Ability to work independently and within a team.
  • Adaptability to fast-paced environments and changing requirements.

