Job Description
- Req#: 3424813f045c
Responsibilities:
- Build and maintain efficient ETL workflows using Python 3, applying both object-oriented and functional paradigms.
- Write comprehensive unit, integration, and end-to-end tests; troubleshoot complex Python traces.
- Automate deployment and integration processes.
- Develop Azure Functions, configure and deploy Storage Accounts and SQL Databases.
- Design relational schemas, optimize queries, and manage advanced MSSQL features including temporal tables, external tables, and row-level security.
- Author and maintain stored procedures, views, and functions.
- Collaborate with cross-functional teams.
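As a loose illustration of the functional-paradigm ETL style the responsibilities above describe (all function and field names here are hypothetical, not from any actual Atlas Technica codebase), a pipeline of small transformation steps might be composed like this:

```python
# Sketch of a functional-style ETL pipeline built from composable steps.
# All names are illustrative assumptions, not a prescribed implementation.
from functools import reduce

def compose(*steps):
    """Chain transformation steps left to right over a list of records."""
    return lambda records: reduce(lambda acc, step: step(acc), steps, records)

def drop_empty(records):
    """Filter out records whose 'value' field is missing or blank."""
    return [r for r in records if r.get("value") not in (None, "")]

def coerce_int(records):
    """'String to native' coercion: turn the 'value' field into an int."""
    return [{**r, "value": int(r["value"])} for r in records]

pipeline = compose(drop_empty, coerce_int)
print(pipeline([{"value": "3"}, {"value": ""}, {"value": "7"}]))
# → [{'value': 3}, {'value': 7}]
```

Each step is a plain function over a list of records, so steps can be unit-tested in isolation and recombined per workflow.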
Requirements:
- English level: B2 or higher
- 5+ years of proven experience as a Data Engineer
- Programming
  - Proficient in Python 3, with both object-oriented and functional paradigms
  - Design and implement ETL workflows using sensible code patterns
  - Discover, navigate, and understand third-party library source code
  - Author unit, integration, and end-to-end tests for new or existing ETL (pytest, fixtures, mocks, monkey patching)
  - Ability to troubleshoot esoteric Python traces encountered in the terminal, logs, or debugger
- Tooling & Automation
  - Git for version control and branching strategies
  - Unix-like shells (Nix-based OS) in cloud environments
  - Author CI/CD configs and scripts (JSON, YAML, Bash, PowerShell)
- Cloud & Serverless Patterns
  - Develop Azure Functions (HTTP, Blob, Queue triggers) using the azure-functions SDK
  - Implement concurrency and resilience (thread pools, tenacity, rate limiters)
- Azure SDKs & Services
  - Deploy and configure:
    - Functions, Web Apps & App Service Plans
    - Storage Accounts, Communication Services
    - SQL Database / Managed Instance
- Data Security and Reliability
  - Maintain strict secrets and access discipline
  - Implement data quality checks and validation steps
- Database Administration
  - Relational data modeling & schema design
  - Data partitioning strategies & temporal tables (system-versioned)
  - Query performance tuning (indexes, execution plans)
  - Selection of optimal data types
  - Complex T-SQL (windowing, CTEs, advanced joins)
  - Advanced MSSQL features (External Tables, Row-Level Security)
- SQL Objects & Schema Management
  - Author and maintain tables, views, stored procedures, functions, and external tables (PolyBase)
- Strong analytical and problem-solving skills, with meticulous attention to detail
- Strong technical documentation skills
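The testing bullet above names pytest, fixtures, mocks, and monkey patching; the same idea can be sketched in a dependency-free way with the standard library's unittest.mock (all function and method names here are hypothetical):

```python
# Sketch of unit-testing an ETL step against a mocked API client.
# `fetch_rows` and `get_rows` are illustrative names, not a real API.
from unittest.mock import MagicMock

def fetch_rows(client):
    """Pull rows from an API client and lowercase their keys."""
    return [{k.lower(): v for k, v in row.items()} for row in client.get_rows()]

def test_fetch_rows():
    client = MagicMock()
    client.get_rows.return_value = [{"ID": 1, "Name": "alpha"}]
    assert fetch_rows(client) == [{"id": 1, "name": "alpha"}]
    client.get_rows.assert_called_once()

test_fetch_rows()
```

Under pytest, the mocked client would typically live in a fixture so many tests can share it; the assertion style is the same.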
Desirable Qualities:
- Programming
  - Develop and maintain custom API client modules with robust retry/error-handling/throttling
- Tooling & Automation
  - Author Dockerfiles and Docker Compose stacks
- Cloud & Serverless Patterns
  - Use Azure Functions Core Tools for local development and testing
  - Integrate custom logging and monitoring frameworks (Application Insights, custom metrics)
- Azure SDKs & Services
  - Implement and maintain authentication methods using Azure identity systems (DefaultAzureCredential and MSAL)
  - Work with Key Vault (secrets, certificates), Storage (blob, container, queue), and Communication Services
  - Manage resources via ARM templates and Azure DevOps pipelines
- Data Security and Reliability
  - Enforce sensible type and schema validation (pydantic)
  - Build reusable transformation utils (key renaming, field splitting, list unpacking, "string to native" type coercions)
- SQL Objects & Schema Management
  - Author and maintain tables, views, stored procedures, functions, and external tables (PolyBase)
  - Implement multiple types of Slowly Changing Dimensions (SCDs)
  - Define and utilize User-Defined Types (UDTs) and table-valued parameters
- Microsoft certifications
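The retry/error-handling/throttling bullet above names tenacity; as a dependency-free sketch of the same backoff idea (decorator and function names are hypothetical), a minimal retry wrapper could look like:

```python
# Stdlib sketch of retry with exponential backoff, in the spirit of tenacity.
# `flaky_fetch` simulates an API call that fails twice, then succeeds.
import time
from functools import wraps

def retry(attempts: int = 3, base_delay: float = 0.1):
    """Retry a flaky call, doubling the delay after each failure."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(attempts):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == attempts - 1:
                        raise  # out of attempts: surface the error
                    time.sleep(base_delay * 2 ** attempt)
        return wrapper
    return decorator

calls = {"n": 0}

@retry(attempts=3, base_delay=0.0)
def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return {"status": "ok"}

print(flaky_fetch())  # succeeds on the third attempt
```

In production code tenacity's `retry`/`stop_after_attempt`/`wait_exponential` cover the same ground with better observability; the sketch just shows the control flow being asked about.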
Position Name: Data Engineer
Reports to: Cloud DevOps Manager
Location/Type: Full-time, Remote

About the company
Atlas Technica's mission is to shoulder IT management, user support, and cybersecurity for our clients, who are hedge funds and other investment firms. Founded in 2016, we have grown year over year through our uncompromising focus on service.
We value ownership, execution, growth, intelligence, and camaraderie. We are looking for people who share our Core Values, thrive, and contribute to this environment while putting the customer first. At Atlas Technica, we offer a competitive salary, comprehensive benefits, and great perks to our global Team. We strive to maintain a professional yet friendly environment while promoting professional and career development for our Team Members. Join Atlas Technica now!
As a Data Engineer, you will be responsible for designing, implementing, and maintaining robust data pipelines and cloud-native solutions that support scalable analytics and operational efficiency. This role requires deep expertise in Python programming, Azure cloud services, and SQL-based data modeling, with a strong emphasis on automation, reliability, and security.
Notice
Talentify is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status.
Talentify provides reasonable accommodations to qualified applicants with disabilities, including disabled veterans. Request assistance at accessibility@talentify.io or 407-000-0000.
Federal law requires every new hire to complete Form I-9 and present proof of identity and U.S. work eligibility.
An Automated Employment Decision Tool (AEDT) will score your job-related skills and responses. Bias-audit & data-use details: www.talentify.io/bias-audit-report. NYC applicants may request an alternative process or accommodation at aedt@talentify.io or 407-000-0000.