Job Details
JPC - 10473 - Data Architect (Azure Databricks)
California, United States | Posted - 10/11/24

Job Description:

Title: Data Architect (Databricks)

Location: Remote

Full-time role

Client: Tredence

 

Required Skills/Qualifications:

14-19 years of relevant experience

Bachelor's and/or Master's degree in Computer Science, or equivalent experience.

Strong communication, analytical, and problem-solving skills with high attention to detail.

 

Desired Experience:

At least two years of experience building and leading highly complex, technical engineering teams.

Strong hands-on experience with Databricks

Experience implementing scalable, sustainable data engineering solutions using tools such as Databricks, Azure, Apache Spark, and Python, including creating, maintaining, and optimizing data pipelines as workloads move from development to production for specific use cases.

Experience managing distributed teams preferred.

Comfortable working with ambiguity and multiple stakeholders.

Comfortable working cross-functionally with product management and directly with customers; ability to deeply understand product and customer personas.

Expertise in the Azure cloud platform

Good SQL knowledge

Knowledge of orchestrating workloads in the cloud

Ability to set and lead the technical vision while balancing business drivers

Strong experience with PySpark and Python programming

Proficiency with APIs, containerization, and orchestration is a plus

Experience handling large and complex sets of data from various sources and databases

Solid grasp of database engineering and design principles.

Experience with Unity Catalog.

Familiarity with CI/CD methods is desired

Teradata experience is a plus (not mandatory)

 

Responsibilities:

Manage end-to-end delivery by investigating problem areas and working cross-functionally with product managers and other stakeholders

Follow the Agile development methodology; think strategically and execute methodically

Develop and manage capacity and growth forecasts for the environment within budget

Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements

Drive optimization, testing, and tooling to improve the quality of solutions

Manage teams that build and operate high volume distributed systems in a SaaS environment

Devise efficient processes that increase velocity and quality

Train counterparts in data pipelining and preparation techniques, making it easier for them to integrate and consume the data they need for their own use cases.

Promote the available data and analytics capabilities and expertise to business unit leaders, and educate them on leveraging these capabilities to achieve their business goals.

Be accountable for operational effectiveness in performance, uptime, and release management of the enterprise data platform

 

About you:

You are self-motivated, collaborative, eager to learn, and hands-on

You love trying out new apps and find yourself coming up with ideas to improve them

You stay current with the latest trends and technologies

You are particular about following industry best practices and have high standards for quality