Remote Jobs

Technical Lead / Data Engineering specialist


Pay: $70.00 - $75.00 / hour
Location: Remote
Employment type: Full-Time

  • Job Description

      Req#: 64457
      Technical Lead / Data Engineering specialist

      Contract - REMOTE / Work From Home

      Data Platform Modernization Project

We are seeking a highly skilled Data Engineering specialist with the skills listed below. The ideal candidate is passionate about data engineering on the Azure cloud, with a strong focus on DevOps practices in building products for our customers. They will communicate and collaborate effectively with internal teams and the customer, and will build code by leveraging or authoring low-level design documents that align with standard coding principles and guidelines.
      • 5+ years of experience in Azure Databricks with PySpark, Databricks workflow, Unity Catalog and Azure Cloud platform
• 4+ years of experience in ADF (Azure Data Factory), ADLS Gen2, and Azure SQL
      • 3+ years of experience in Python programming & package builds
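As a rough illustration of the "package builds" requirement, a minimal pyproject.toml for a reusable transformations library might look like the following sketch (the package name, version, and dependency pins are hypothetical, not part of this role's codebase):

```toml
[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"

[project]
name = "example-data-transforms"   # hypothetical package name
version = "0.1.0"
description = "Reusable PySpark transformation utilities"
requires-python = ">=3.9"
dependencies = [
    "pyspark>=3.4",
]

[project.optional-dependencies]
test = ["pytest"]
```

A library packaged this way can be built into a wheel and attached to a Databricks cluster or installed in a CI/CD pipeline with pip.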

      Job Description of Role / Technical Skills:
      • Data management experience handling analytics workloads, covering the design, development, and maintenance of Lakehouse solutions that source data from platforms such as ERP systems, APIs, relational stores, NoSQL databases, and on-prem sources, using Databricks/PySpark as the distributed big-data processing service and supporting batch and near-real-time ingestion, transformation, and processing
      • Ability to optimize Spark jobs and manage large-scale data processing using RDD/DataFrame APIs. Demonstrated expertise in partitioning strategies, file format optimization (Parquet/Delta), and Spark SQL tuning. Familiarity with Databricks runtime versions, cluster policies, libraries, and workspace management
      • Skilled in governing and managing data access for an Azure Data Lakehouse with Unity Catalog. Experience in configuring data permissions, object lineage, and access policies with Unity Catalog. Understanding of integrating Unity Catalog with Azure AD, external metastores, and audit trails
      • Experience in building efficient orchestration solutions using Azure Data Factory and Databricks Workflows. Ability to design modular, reusable workflows using tasks, triggers, and dependencies. Skilled in using dynamic expressions, parameterized pipelines, custom activities, and triggers. Familiarity with integration runtime configurations, pipeline performance tuning, and error handling strategies.
      • Strong experience in implementing secure, hierarchical namespace-based data lake storage for structured/semi-structured data, aligned to bronze-silver-gold layers with ADLS Gen2. Hands-on experience with lifecycle policies, access control (RBAC/ACLs), and folder-level security. Understanding of best practices in file partitioning, retention management, and storage performance optimization.
      • Capable of developing T-SQL queries, stored procedures, and managing metadata layers on Azure SQL. Comprehensive experience working across the Azure ecosystem, including networking, security, monitoring, and cost management relevant to data engineering workloads. Understanding of VNets, Private Endpoints, Key Vaults, Managed Identities, and Azure Monitor. Exposure to DevOps tools for deployment automation (e.g., Azure DevOps, ARM/Bicep/Terraform).
      • Experience in writing modular, testable Python code used in data transformations, utility functions, and packaging reusable components. Familiarity with Python environments, dependency management (pip/Poetry/Conda), and packaging libraries. Ability to write unit tests using Pytest/unittest and integrate them with CI/CD pipelines.
      • Lead solution design discussions, mentor junior engineers, and ensure adherence to coding guidelines, design patterns, and peer review processes. Able to prepare design documents for development and guide the team technically. Experience preparing technical design documents, HLDs/LLDs, and architecture diagrams. Familiarity with code quality tools (e.g., SonarQube, pylint) and version control workflows (Git)
      • Demonstrates strong verbal and written communication, proactive stakeholder engagement, and a collaborative attitude in cross-functional teams. Ability to articulate technical concepts clearly to technical and business audiences. Experience in working with product owners, QA, and BAs to translate requirements into deliverables
      • Prior experience working on Agile/Scrum projects with exposure to tools like Jira/Azure DevOps. Provides regular updates and carries out responsibilities proactively and with due diligence. Communicates effectively with internal and customer stakeholders verbally, by email, and via instant messaging
      • Strong interpersonal skills to build and maintain productive relationships with team members. Provide constructive feedback during code reviews and be open to receiving feedback on your own code. Strong problem-solving and analytical thinking, with the capability to troubleshoot and resolve issues efficiently
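For illustration, the "modular, testable Python code" requirement above might look like the following minimal sketch. The function, field names, and validation rules are hypothetical, not taken from this role; the point is a pure transformation function that a Pytest-style unit test can exercise in CI/CD without a Spark cluster:

```python
def normalize_amounts(rows, rate):
    """Convert raw amounts by a conversion rate and drop invalid records.

    rows: iterable of dicts with an 'amount' key (may be None or negative)
    rate: conversion rate applied to each valid amount
    """
    cleaned = []
    for row in rows:
        amount = row.get("amount")
        if amount is None or amount < 0:
            continue  # skip records that fail basic validation
        cleaned.append({**row, "amount": round(amount * rate, 2)})
    return cleaned


# A Pytest-style unit test: because the function has no Spark or I/O
# dependencies, it runs in any CI/CD pipeline as-is.
def test_normalize_amounts():
    raw = [
        {"id": 1, "amount": 10.0},
        {"id": 2, "amount": None},
        {"id": 3, "amount": -5.0},
    ]
    assert normalize_amounts(raw, rate=1.1) == [{"id": 1, "amount": 11.0}]
```

The same style scales to PySpark work by keeping DataFrame logic in small named functions that tests can call against fixture data.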
  • About the company

      The best remote jobs for you

Notice

Talentify is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status.

Talentify provides reasonable accommodations to qualified applicants with disabilities, including disabled veterans. Request assistance at accessibility@talentify.io or 407-000-0000.

Federal law requires every new hire to complete Form I-9 and present proof of identity and U.S. work eligibility.

An Automated Employment Decision Tool (AEDT) will score your job-related skills and responses. Bias-audit & data-use details: www.talentify.io/bias-audit-report. NYC applicants may request an alternative process or accommodation at aedt@talentify.io or 407-000-0000.