
Senior Data Analytics Engineer

Phoenix, AZ
The Opportunity

We are looking for a Senior Data Analytics Engineer to support the design, development, and optimization of data pipelines and analytics solutions in an agile, SDLC-driven environment. The role focuses on building scalable ETL frameworks, developing complex SQL queries and data models, and enabling reliable reporting and analytics across enterprise data platforms.

The ideal candidate brings strong experience in data engineering, big data technologies, and analytics tooling, along with a hands-on approach to building and maintaining high-quality, production-grade data solutions.
  • Role: Senior Data Analytics Engineer
  • Experience: 6–9 Years
  • Work Location: Phoenix, AZ
  • Project Duration: 12+ Month Contract

Key Responsibilities
  • Design, build, and maintain ETL and data pipelines using Python, SQL, and big data technologies
  • Develop and optimize complex SQL queries and data models across platforms such as SQL Server and Oracle
  • Support data warehousing and data lake architectures, including schema design and entity relationship modeling
  • Build and support analytics and reporting solutions using tools such as SSIS, SSRS, Tableau, Power BI, or Qlik
  • Integrate data across systems using REST/SOAP APIs, file-based pipelines, and ETL frameworks
  • Develop and maintain PySpark/Spark-based data processing workflows
  • Collaborate within an agile development team across the full software development life cycle (SDLC)
  • Ensure data quality, performance, and reliability across pipelines and reporting solutions
  • Identify opportunities to improve data platform performance, scalability, and maintainability

Required Qualifications
  • 6–9 years of experience in data engineering, analytics engineering, or related roles
  • Strong proficiency in SQL (MS SQL Server, Oracle) and relational data modeling
  • Hands-on experience building ETL pipelines using Python and enterprise Microsoft tooling (SSIS for ETL, SSRS for reporting)
  • Experience working with big data technologies and distributed data processing (Spark/PySpark)
  • Solid understanding of data warehousing, data lakes, and master data management concepts
  • Experience integrating systems using APIs and data exchange frameworks
  • Strong problem-solving skills and ability to write efficient, reusable, production-grade code

Preferred Qualifications
  • Experience with Oracle PL/SQL
  • Familiarity with Airflow or workflow orchestration tools
  • Exposure to cloud-based data platforms and analytics services
  • Experience working with Apptio or financial/IT analytics platforms
  • Background supporting enterprise-scale analytics and reporting environments

Technologent is an Equal Opportunity Employer (EEO/AA Employer/Vet/Disabled). For reasonable accommodations, please contact us at hr@technologent.com.

Technologent is a Global Provider of Edge-to-Edge℠ Information Technology Solutions and Services for Fortune 1000 and SMB companies. We offer a unique blend of business practices aligned to address top CIO concerns. Our core competencies focus on data center infrastructure, business continuity, data protection, service automation and orchestration, continuous intelligence, monitoring, connectivity, collaboration, and cybersecurity. These practices are supported by our professional services, digital transformation services, and financial services offerings. By providing custom solutions and services designed to fit your business needs, we enable your organization to be more agile, responsive, and competitive. Technologent empowers your company to ascend to the next level in IT.

Headquartered in Irvine, CA, Technologent has offices throughout the US and proudly serves clients around the world. When partnering with Technologent, organizations benefit from the highest caliber of professionals, committed to delivering exceptional business outcomes backed by unmatched service and support.
