Job Details
Data Engineer (Middle+)
Data Engineer (Middle+) position. Full-time, remote. Salary: 200-230K RUB net. Requires Java/Groovy, BigData, SQL, ETL tools, and Hadoop. Tasks include developing integration solutions, defining the tech stack, and solving complex technical challenges.
Responsibilities:
- Independent development, implementation, and support of integration solutions using the technologies adopted by the team (Java, Groovy, Apache NiFi, Airflow);
- Defining the technology stack for specific projects and tasks;
- Solving technically complex problems that other engineers on the team cannot solve;
- Responding promptly to reported problems within the area of responsibility and completing tasks within the established deadlines;
- Developing and keeping up to date the documentation on the interaction of the configuration units of the big data platform;
- Reporting on activities to the head of the department/manager in the manner prescribed by management;
- Controlling the quality of integration solutions and creating tasks/defects for subsequent refactoring;
- Defining the technology strategy for the development of a project or product, working for the future;
- Building processes (e.g., CI/CD, code review), implementing and developing engineering practices.
Requirements:
- Proficiency in one of the programming languages (Java, Groovy), knowledge of OOP principles, and the ability to read other people's code;
- Experience building, compiling, and deploying projects in Rancher (Docker);
- Experience designing, implementing, developing, and supporting integration solutions on the BigData technology stack;
- Knowledge of SQL (indexes, functions, the ability to read query plans, query optimization);
- Experience with any relational database (Oracle, Postgres, MySQL, MSSQL, DB2, etc.);
- Ability to work with Git from the command line;
- Knowledge of the specifics of ETL tools (Apache NiFi, Airflow, SAP BW integration buses, Talend, Informatica, SAS, etc.);
- Experience working with Hadoop;
- Understanding of the HDFS structure and data formats;
- Experience working with Hive or any other Hadoop-based storage;
- Experience using project management and documentation systems;
- Ability to work with architectural diagrams;
- Understanding of DWH and Data Lake principles for building and storing data.

Additional requirements:
- Experience in Unix/Linux or Hadoop administration (HDFS, YARN, Ranger, Spark, ZooKeeper), Zabbix, Ansible.
Salary: 200-230K RUB net. Employment: as an individual entrepreneur (IP, sole proprietorship). Work schedule: full-time. Location: remote.