Technology

Lead Data/AI Engineering

Bengaluru, India

Apply now

Job Description:

Job Title / Advertised Job Title: Lead Data/AI Engineering

Job Summary:

The Lead Data Engineer is responsible for architecting, developing, and optimizing robust data solutions that support the unique analytical and reporting needs of telecom finance. This role leads a team in building scalable data pipelines, integrating diverse financial and operational data sources, and implementing advanced analytics and AI models to drive business insights and efficiency. Collaborating closely with finance, IT, and business stakeholders, the Lead Data Engineer ensures data quality, governance, and compliance while leveraging cutting-edge technologies in data warehousing, cloud platforms, and automation. The ideal candidate brings deep expertise in both telecom and financial data domains, excels at solving complex business challenges, and mentors the team to deliver data-driven solutions that enable strategic decision-making and financial growth.

Roles and Responsibilities:


  • Design, Development, Testing, and Deployment: Drive the development of scalable Data & AI warehouse applications by leveraging software engineering best practices such as automation, version control, and CI/CD. Implement comprehensive testing strategies to ensure reliability and optimal performance. Manage deployment processes end-to-end, effectively addressing configuration, environment, and security concerns.
  • Engineering and Analytics: Transform Data Warehouse and AI use case requirements into robust data models and efficient pipelines, ensuring data integrity by applying statistical quality controls and advanced AI methodologies.
  • API & Microservice Development: Design and build secure, scalable APIs and microservices for seamless data integration across warehouse platforms, ensuring usability, strong security, and adherence to best practices.
  • Platform Scalability & Optimization: Assess and select the most suitable technologies for cloud and on-premises data warehouse deployments, implementing strategies to ensure scalability, robust performance monitoring, and cost-effective operations.
  • Lead: Lead the execution of complex data engineering and AI projects to solve critical business problems and deliver impactful results.
  • Technologies: Leverage deep expertise in Data & AI technologies (such as Spark, Kafka, Databricks, and Snowflake), programming (including Java, Scala, Python, and SQL), API integration patterns (like HTTP/REST and GraphQL), and leading cloud platforms (Azure, AWS, GCP) to design and deliver data warehousing solutions.
  • Financial Data Integration:
    Oversee the integration of highly granular financial, billing, and operational data from multiple telecom systems (e.g., OSS/BSS, ERP, CRM) to ensure a single source of truth for finance analytics.
  • Regulatory & Compliance Reporting:
    Ensure that data solutions support strict telecom and financial regulatory requirements (e.g., SOX, GAAP, IFRS, FCC mandates), including automated audit trails and comprehensive reporting.
  • Revenue Assurance & Fraud Detection:
    Develop and maintain data pipelines and analytical models for revenue assurance, leakage detection, and fraud monitoring specific to telecom finance operations.
  • Real-Time & Near Real-Time Analytics:
    Architect solutions that deliver real-time or near real-time insights for finance operations, supporting decision-making in fast-paced telecom environments.
  • Cost Allocation & Profitability Analysis:
    Implement advanced data models and processes for cost allocation, margin analysis, and profitability reporting across telecom products, services, and regions.
  • Intercompany & Affiliate Billing:
    Design and manage data flows for complex intercompany transactions and affiliate billing, ensuring financial accuracy across business units and partnerships.

Shift timing (if any): 12:30 PM to 9:30 PM IST  

Location / Additional Location (if any): Bangalore, Hyderabad

Overall Experience:

  • Typically requires a minimum of 15 years of progressive experience in data engineering, data architecture, or related fields.
  • At least 3–5 years of hands-on experience in the telecom or finance domain, preferably integrating and managing large-scale financial and operational data systems.
  • Demonstrated experience leading complex data projects, managing teams, and delivering end-to-end data solutions in large or matrixed organizations is highly valued.
  • Experience with cloud data platforms, big data technologies, and implementing best practices in data governance and DevOps is strongly preferred.

Primary / Mandatory skills:

  • Delivery: Proven experience in managing and delivering complex data engineering and AI solutions for major business challenges.
  • Telecom Data Domain Expertise:
    Deep understanding of telecom data structures, including OSS/BSS, CDRs, billing, customer, and product hierarchies.
  • Financial Data Modeling:
    Experience designing data models for financial reporting, revenue recognition, cost allocation, and profitability analysis specific to telecom finance.
  • Regulatory Compliance Knowledge:
    Familiarity with telecom and finance regulatory frameworks (e.g., SOX, IFRS, FCC reporting) and ability to implement compliant data solutions.
  • Data Reconciliation & Audit:
    Strong skills in building automated data reconciliation, validation, and audit trails to ensure financial integrity.
  • Data Architecture & Modeling:
    Expertise in designing scalable, high-performance data architectures (e.g., data warehouses, data lakes, data marts) and creating robust data models.
  • ETL/ELT Development:
    Advanced skills in building, optimizing, and maintaining data pipelines using modern ETL/ELT tools (e.g., Informatica, Talend, dbt, Azure Data Factory).
  • Cloud Platforms:
    Proficiency with cloud data services and platforms such as AWS, Azure, or Google Cloud (e.g., Redshift, Snowflake, Databricks, BigQuery).
  • Programming Languages:
    Strong coding ability in SQL and at least one general-purpose language (e.g., Python, Scala, Java).
  • Big Data Technologies:
    Experience with distributed data processing frameworks (e.g., Spark, Hadoop) and real-time streaming tools (e.g., Kafka).
  • Data Governance & Quality:
    Knowledge of data governance practices, data lineage, data cataloging, and implementing data quality checks.
  • CI/CD & Automation:
    Experience in automating data workflows, version control (e.g., Git), and deploying CI/CD pipelines for data applications.
  • Analytics & AI/ML Integration:
    Ability to support advanced analytics and integrate machine learning pipelines with core data platforms.
  • Leadership & Collaboration:
    Proven track record in leading teams, mentoring engineers, and collaborating with business, analytics, and IT stakeholders.
  • Problem Solving & Communication:
    Strong analytical, troubleshooting, and communication skills to translate business needs into technical solutions.

Secondary / Desired skills:

  • Data Visualization:
    Experience with BI tools such as Power BI, Tableau, or Looker for creating dashboards and visual analytics.
  • AI/ML Model Operationalization:
    Familiarity with deploying, monitoring, and scaling machine learning models in production environments (MLOps).
  • API & Microservices Development:
    Understanding of building and consuming RESTful APIs and microservices for data integration.
  • Data Security & Privacy:
    Knowledge of data encryption, access controls, and compliance with data privacy regulations (GDPR, CCPA, SOX).
  • Data Catalogs & Metadata Management:
    Experience with tools like Alation, Collibra, or Azure Purview for cataloging and managing metadata.
  • Workflow Orchestration:
    Hands-on with workflow tools (e.g., Apache Airflow, Control-M, Prefect) for scheduling and monitoring data jobs.
  • Performance Tuning:
    Skills in optimizing queries, storage, and processing for cost and speed.
  • Change/Data Release Management:
    Experience in managing data schema evolution, versioning, and deployment coordination.
  • GitHub & Copilot Proficiency:
    Proficient in using GitHub for version control, collaboration, and CI/CD pipelines; experience leveraging GitHub Copilot to enhance coding efficiency and foster team productivity.
  • DevOps for Data:
    Exposure to infrastructure-as-code (Terraform, CloudFormation) and containerization (Docker, Kubernetes) for data workloads.
  • Domain Knowledge:
    Understanding of Finance, Telecom, Retail, or the relevant business domain to better align data solutions with business needs.
  • Project Management:
    Familiarity with Agile, Scrum, or Kanban methodologies for managing data projects.
  • Stakeholder Management:
    Ability to effectively engage with non-technical users, translate requirements, and manage expectations.

Additional information (if any):

  • Leadership & Mentorship:
    Expected to mentor and develop junior engineers, foster a culture of knowledge sharing, and lead by example in adopting best practices.
  • Cross-Functional Collaboration:
    Will work closely with data scientists, business analysts, product managers, and IT teams to deliver end-to-end solutions that meet business needs.
  • Innovation & Continuous Improvement:
    Encouraged to stay current with emerging technologies and trends in data engineering, AI/ML, and cloud platforms, and to proactively recommend and implement improvements.
  • Ownership & Accountability:
    Responsible for the entire data engineering lifecycle, including architecture, implementation, monitoring, and optimization.
  • Communication Skills:
    Must be able to translate complex technical concepts into clear, actionable insights for non-technical stakeholders and leadership.
  • Change Management:
    Experience managing change in fast-paced environments and guiding teams through technology transformations is highly valued.
  • Quality & Compliance Focus:
    Commitment to data quality, security, and compliance is essential, with experience in implementing and maintaining controls and standards.
  • Business Impact:
    Expected to contribute to measurable business outcomes by enabling data-driven decision-making and supporting organizational goals.

Education Qualification:

  • Bachelor’s degree in Computer Science, Information Technology or a related field is required.
  • A Master’s degree in Data Science, Computer Science, Engineering, or a related discipline is preferred.

Certifications (if any specific):

  • Cloud Platform Certifications (AWS, Azure, GCP)
  • Data Engineering & Big Data (Databricks, CCDP)
  • Database & Data Warehousing (SnowPro, GCP)
  • General Data & AI (CDMP, AI/ML integration, Microsoft)
  • DevOps & Automation (GitHub, GitLab CI/CD)
  • Relevant certifications in financial data analytics or telecom data management

Weekly Hours:

40

Time Type:

Regular

Location:

IND:KA:Bengaluru / Innovator Building, Itpb, Whitefield Rd - Adm: Intl Tech Park, Innovator Bldg

It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.

Job ID R-73693 Date posted 07/14/2025

Benefits

Your needs? Met. Your wants? Considered. Take a look at our comprehensive benefits.

  • Paid Time Off
  • Tuition Assistance
  • Insurance Options
  • Discounts
  • Training & Development


Our hiring process

Apply Now

Confirm your qualifications align with the job requirements and submit your application.

Assessments

You may be required to complete one or more assessments, depending on the role.

Interview

Get ready to put your best foot forward! More than one interview may be necessary.

Conditional Job Offer

We’ll reach out to discuss a conditional job offer and the next steps to joining the team.

Background Check

Timing is important – complete the necessary actions to proceed with onboarding.

Welcome to the Team!

Congratulations! It’s time to experience #LifeAtATT.

Check your email (including your spam folder) throughout the process for important messages and next steps.
