Azure Data Engineer [Romania]


 


Minimum total experience of 10 years, including at least 5 years of relevant experience in the skills listed below.

Git, Gitlab

· The candidate must have thorough knowledge of
- collaboration in software development teams
- shared software repositories and their concepts

· The candidate must have experience with GitLab

· The candidate must have experience managing code locally and remotely using features such as -
- branching
- merging
- versioning and releases
- integration with the Code Pipeline

JIRA

· The candidate must have experience with Jira, including creating Kanban boards, personalized dashboards, reports, and filters

· The candidate must have experience working with Jira in agile project setups

· Experience with workflow configuration

Tools for Reporting and Analysis

· The candidate must have working experience with at least one of the following tools for reporting and analysis: SSIS, SSAS, SSRS, or Power BI

· The candidate must be able to explain the purpose and main functionalities of the cited tools

· The candidate must be able to explain how to model and prepare data-mart interfaces for access by Tableau or SSAS/SSRS in the way most convenient for those tools (minimizing or avoiding post-processing of data in the reporting tools and enabling fast report development)

Code Standards / Code Review

· The candidate should be able to lead and guide code reviews and compare the results against publicly known best practices or TEF guidelines

· The candidate should be able to communicate the results to developers

· The candidate should be able to define improvement measures and track their success

Prototyping

· The candidate should be able to build quick prototypes to validate solution designs and first-draft ideas

· The candidate should be able to guide developers in extending these prototypes into working production implementations

Azure Data Factory and Synapse pipelines development

· The candidate should understand how Data Factory pipelines work and how they integrate with other Azure services

· The candidate should be able to estimate effort, create a design, and implement it

· The candidate should be able to create a solution design for a set of requirements

· The candidate should understand the use of a Databricks and/or Synapse Delta Lake-based staging layer and business layer for BI purposes

· The candidate should understand data modelling and structural design

Spark development

· The candidate should have practical expertise in PySpark and in how it is used within Synapse and Databricks notebooks

· The candidate should have knowledge of Spark architecture, data structures, and MPP (massively parallel processing) execution flows

· The candidate should understand configuration of the Spark environment, worker nodes, and in-memory facilities

· The candidate should have expertise in Scala Spark development and in exposing that functionality via the SQL or PySpark APIs

· The candidate should have expertise in Spark SQL and in how it is used within Databricks and Synapse notebooks

Kafka

· The candidate should have professional knowledge and project experience with Kafka, including
- integrating Kafka with Spark and other persistence services
- how a Kafka cluster works and how it is configured (e.g. retention settings)
- working with the Confluent stack (Kafka Streams, KSQL, Connect)
- explaining the principles of how Kafka works and what it can be used for
- giving examples from former projects

Azure DevOps

· The candidate should have working experience with CI/CD in the DWH development process
- continuous integration
- continuous delivery / deployment

· The candidate should be able to describe tools that support automation of testing and delivery processes

· The candidate should be able to explain the benefits and limitations of these concepts as applied to the DWH development process, and give examples from their own practical experience

Databases in Azure

· The candidate should be able to compare the Azure database services with other cloud and on-premises offerings

· The candidate should know the strengths and shortcomings of each Azure database service

· The candidate should be able to implement DDL, DML, and queries for given tasks using SQL and T-SQL

DB development T-SQL

· Experience with T-SQL or comparable SQL dialects, preferably on MS Databases

RDBMS Tools Cloud

· Experience working with front-end management tools for database systems such as
- Azure Databricks and Azure Synapse
- their integration into Azure data pipelines

Data Modelling Tools

· The candidate should have working experience with at least one modelling tool and should be able to explain -
- benefits of using data modelling tools
- differences of conceptual, logical, physical data models
- reverse engineering (why / when / how)
- generating incremental DDLs (why / when / how)
- repository configuration (default / customized settings)

· The candidate should be able to configure and apply templates for new entities and DDLs

DWH Dev - layer concepts

· The candidate should have working experience in projects with:
- layered DWH architecture: why it is used and what the benefits are
- version control

· Experience with different sourcing strategies (db connections, replication services, sourcing frameworks, file-based / ext.tables)

· Experience integrating data from different sources into the same target data model in a source-abstracted and highly interpreted way. Examples of such projects are expected.

· The candidate should be able to give examples of projects where they helped improve layered DWH structures

DWH Dev - data modelling

· Working experience with data-modelling activities in projects, and the ability to explain:
- 3NF / star / snowflake / dimensional modelling principles
- why/when each is used (advantages and disadvantages)
- resolving n:m entity relations
- source-specific vs source-abstracted modelling and data integration
- the meaning and purpose of facts, dimensions, flat hierarchies, degenerate dimensions, etc.
- agile modelling approaches
- differences between the Inmon and Kimball modelling approaches
- event- and state-based modelled entities, with examples
- differences between conceptual / logical / physical data modelling
- dealing with version control / historization / time-variance / SCD2
- applying modelling experience to different use cases, discussing the cases and explaining why the chosen approach was optimal

· The candidate should be able to give examples from project experience where this type of data modelling was used, compare it to other modelling approaches, and describe the advantages and disadvantages of the chosen solution

· Ability to analyse existing data models without explicit physical references and to reverse-engineer the existing data model

Requirements engineering

· The candidate should have knowledge of the requirements phase: its purpose, the stakeholders involved, and requirements documentation/specification

· Can refer to practical examples explaining the use of and need for requirements specification within the development process

· Experience with
- meeting the stakeholders
- documenting the requirements
- requirement analysis and validation
- considering non-functional and legal requirements
- handling changing requirements
- negotiating and prioritizing the requirements
- leading the requirements engineering process

Project Management

· The candidate should be capable of -

- leading interdisciplinary teams (developers, analysts, etc.)
- managing a near-shore center or a geographically distributed team of developers
- working with document management systems
- managing multiple projects in parallel:
- defining and clarifying project scope
- creating the project plan
- breaking down the project scope into smaller work packages
- specifying the work packages
- preparing and handing over work specifications
- defining acceptance criteria, code review, and quality checks
- involving and communicating with project stakeholders

· The candidate should be able to explain the key properties of the main classical, agile, and mixed project management approaches (Waterfall, Scrum, Kanban, etc.) and refer to their own practical experience

· The candidate should be able to explain the specifics of leading a team of developers in a near-shore center or of working with geographically distributed teams

Job Type: Full-time

Salary: 226,962.00RON - 246,962.50RON per year

Ability to commute/relocate:

  • București, Ilfov: Reliably commute or planning to relocate before starting work (Required)
