Big Data DevOps Engineer - Digital

Publish Date:  Dec 24, 2020

Warszawa, Warszawa, PL

Company:  Atos

About Atos

Atos is a global leader in digital transformation with over 110,000 employees in 73 countries and annual revenue of over €11 billion. European number one in Cloud, Cybersecurity and High-Performance Computing, the Group provides end-to-end Orchestrated Hybrid Cloud, Big Data, Business Applications and Digital Workplace solutions. The Group is the Worldwide Information Technology Partner for the Olympic & Paralympic Games and operates under the brands Atos, Atos Syntel, and Unify. Atos is an SE (Societas Europaea), listed on the CAC40 Paris stock index.

The purpose of Atos is to help design the future of the information technology space. Its expertise and services support the development of knowledge, education as well as multicultural and pluralistic approaches to research that contribute to scientific and technological excellence. Across the world, the group enables its customers, employees and collaborators, and members of societies at large to live, work and develop sustainably and confidently in the information technology space.



Job Description


As a Big Data DevOps Engineer, you will work on a range of high-priority projects for clients worldwide, focused mainly on mixed Cloud solutions, on-premises solutions and the Hadoop ecosystem. This includes automating operations, managing platform deployment, guiding platform architecture, and ensuring flexibility and scalability.


Are you ready for this kind of challenge?





  1. Gain knowledge of Big Data solutions from around the world
  2. Work primarily with Cloud technologies such as Google Cloud, Azure or AWS
  3. Work closely with a variety of external and internal vendors
  4. Finally, design, create and implement the best solutions in Big Data environments


  • Designing, building and maintaining Big Data environments, including Cloud solutions
  • Creating automation across Big Data environments
  • Being an active member of an engineering DevOps team, collaborating with customers all over the world
  • Participating in R&D projects related to Cloud computing and Big Data
  • Suggesting new standards and solutions to ensure the high quality of delivered services
  • Acting as a technical team leader in Big Data solutions for DevOps engineers in the Digital world
  • Tracking trends and the latest issues related to the domain of conducted projects
  • Creating technical documentation, processes and procedures for the environments
  • Supporting deployment, customization, upgrades and monitoring via DevOps tools


Job Requirements:


  • 3 years of experience with Linux/Unix systems, including installation, configuration, networking, backups, updates and patching
  • 2 years of experience with Big Data platform solutions, including Hadoop, HDFS, HBase and Spark
  • A very strong Java and SQL background, with the ability to think in terms of networks rather than tables
  • Knowledge of cloud solutions (e.g. Amazon, Google, Azure, Oracle)
  • Knowledge of monitoring systems (e.g. Nagios) and automation tools
  • Knowledge of Spark Streaming, Kafka, NiFi, Flume, ZooKeeper, Hive, HAWQ, Cassandra and Impala
  • Experience with enterprise application and information integration
  • Hortonworks Hadoop certification is a plus
  • Passion for technology and understanding how things work
  • Ability to work occasional weekends and a varied schedule (e.g. during go-lives)


Competences and skills:


  • An inspiring, motivating and positive attitude; does not hesitate to take on any challenge
  • A positive mindset and a can-do attitude, rather than presenting known issues or other blockers as reasons for missing delivery and quality targets
  • Ability to communicate effectively, both verbally and in writing
  • Good teamwork and interpersonal skills
  • Readiness to work with Big Data (reliably processing terabytes of data daily) and Fast Data (processing tens or hundreds of thousands of events per second in cluster/cloud environments)
  • Very good English language skills (at least B2 level)


Nice to have:


  • Knowledge of VMware/Microsoft system administration
  • Knowledge of Ansible and PostgreSQL
  • Familiarity with programming languages (especially Scala, Python and R)

Your Application:

If you wish to apply for this position, please click below to complete our online application form and attach your CV in Word, RTF or plain-text format.
Atos does not discriminate on the basis of race, religion, colour, sex, age, disability or sexual orientation. All recruitment decisions are based solely on qualifications, skills, knowledge and experience and relevant business requirements.
We are committed to making reasonable adjustments to the applications process for people with disabilities.

We take care of your personal data privacy. You can find more information about how your personal data is processed during the recruitment process on our website: https://atos.net/pl/polska/gdpr.