Apidel Technologies
Job title:
Data Engineer Level 3
Company
Apidel Technologies
Job description
We are seeking a highly skilled and motivated Senior Software Engineer to join our team for a data transformation and mapping project. The successful candidate will work on designing, building, and optimizing large-scale data pipelines and transformation processes within the Azure cloud ecosystem. This role is critical to ensuring seamless data flow, transformation, and integration, which are vital for our organization's data-driven initiatives.

The ideal candidate will have deep technical expertise in Azure Synapse, SQL, Azure Data Factory, and Python, with experience designing scalable and efficient ETL (Extract, Transform, Load) processes. Bonus points for candidates with experience in financial regulatory environments and an understanding of financial data standards and compliance requirements.

Required Qualifications:
- 5+ years of experience in software engineering or data engineering roles, with a focus on data transformation and mapping.
- Extensive experience with Azure Cloud Services, including Azure Synapse Analytics, Azure Data Factory, and other related tools.
- Proficiency in SQL for data querying, transformation, and optimization.
- Strong Python programming skills for scripting and automation in data workflows.
- Experience building and optimizing ETL pipelines for large-scale data integration.
- Proven ability to design and implement cloud-based data architecture and processing systems.
- Strong problem-solving skills, attention to detail, and a proactive attitude toward troubleshooting and optimizing data workflows.
- Ability to work in an agile environment, managing multiple priorities and collaborating effectively with cross-functional teams.

Preferred Qualifications:
- Financial regulatory experience or familiarity with financial data standards and compliance requirements.
- Knowledge of data governance, security, and compliance best practices.
- Experience working with big data technologies or distributed systems within a cloud environment.
- Certification(s) in Azure Data Engineering, Azure Solutions Architecture, or related fields.

Key Responsibilities:
- Data Transformation and Mapping: Lead the design, development, and implementation of complex data transformation and mapping pipelines using Azure Data Factory, Azure Synapse, and SQL.
- Azure Cloud Development: Architect and implement cloud-native solutions leveraging the Azure stack, ensuring security, scalability, and performance.
- ETL Pipelines: Develop, monitor, and optimize ETL processes for data ingestion and transformation, ensuring data is transformed and mapped accurately between systems.
- SQL Expertise: Write and optimize complex SQL queries for efficient data extraction, transformation, and loading in Azure Synapse.
- Data Integration: Collaborate with cross-functional teams to integrate and align data across multiple systems and platforms.
- Automation: Leverage Python to automate data workflows, improve data processing efficiency, and ensure the reliability of data pipelines.
- Data Quality Assurance: Ensure data quality and integrity throughout the data transformation process by implementing appropriate testing and validation techniques.
- Documentation and Best Practices: Create and maintain documentation, and advocate for best practices in data engineering, ensuring compliance with internal and external data handling standards.
- Financial Regulatory Data (bonus): If applicable, design and implement solutions that comply with financial regulations and reporting standards (e.g., SEC, FINRA), ensuring data mapping aligns with compliance requirements.

Note to Vendors
- Top 3 skills – Azure, Python, Synapse/SQL
- Soft skills needed – Adaptability, curiosity and research skills, ability to pivot
- Project the person will be supporting – Extracting data to feed another system; KPF Compliance
- Team details (size, dynamics, locations) – Small team; 3 people supporting the same space, predominantly system analysts. Will work closely with someone from a different team.
- Work location (in office, hybrid, remote) – Preference would be Cincinnati. Mostly remote, but may need to come into the office periodically for meetings.
- Is travel required – No
- Max rate, if applicable – Please submit at best market rate.
- Required working hours – Standard work hours, EST
- Interview process and when it will start – Prescreening will be the first step of the process; candidates selected after the prescreening will move on to the interview rounds.
- Prescreening details – 5 interview questions, a coding challenge (CODING SHOULD BE DONE IN PYTHON), and a game
- When do you want this person to start – ASAP

Preferred skills
Data Engineer Level 3
Expected salary
Location
Blue Ash, OH
Job date
Thu, 21 Nov 2024 08:58:13 GMT