Role to resource for Xcelocloud.com
("MS Fabric" experience is the key to this role.)
Type: Contract, ~4 months (might extend)
Pay: $25 - $30 per hour
Hours: 30-40 per week
Schedule: US Day shift
Department: PMO
Location: Remote (India)
Key Responsibilities:
Design, build, and maintain automated data ingestion pipelines using Microsoft Fabric tools (Azure Data Factory / Fabric Pipelines).
Develop robust ELT processes that cleanse, transform, and model data in OneLake.
Support the migration of legacy systems, including:
On-prem SQL Server databases
The Illuminate Assessment Data Mart from Google BigQuery and dbt Cloud
Collaborate with the Lead Data Architect and BI teams to align data models with analytics use cases.
Leverage Spark Notebooks and Dataflows Gen2 to implement scalable transformation logic.
Create reusable pipeline templates, enforce version control via Git, and ensure quality through automated testing.
Assist with integrating curated datasets into Tableau and/or Power BI.
Support post-deployment stabilization and knowledge transfer to the client’s internal team.
Contribute to documentation of pipelines, data dictionaries, and support guides.
Required Skills & Qualifications:
4+ years of experience in a data engineering or ETL development role.
Strong proficiency in:
Azure Data Factory / Fabric Pipelines
SQL (T-SQL and/or Spark SQL)
Python for scripting and transformation
Spark Notebooks (PySpark or equivalent)
Experience working in cloud data ecosystems, especially Microsoft Fabric, Azure Synapse, and OneLake.
Familiarity with modern data modeling techniques and ELT best practices.