AVP, Tech Lead, RDL
LPL Financial

APAC/Oceania, India, Hyderabad


Salary: N/A
Rank: VP
Responsibility: Design/Transform
Scope: Global
Workplace: 100% in office
Functions: IT
Reports to Level: N-2
Travel Max: 0%
Posting Date: 02-18-2026

Description

We are seeking a hands-on AVP, Data Lake/Lakehouse Engineer to design, build, and operate robust data lake and lakehouse solutions that enable analytics, reporting, and AI-driven products. This role will be pivotal in bridging the gap between traditional data warehouses and modern data lakes, ensuring seamless data integration, governance, and accessibility for business intelligence and advanced analytics.

Key Responsibilities

  • Design, implement, and maintain scalable data lake and lakehouse architectures using cloud-native services (AWS S3, Glue, Lake Formation, Delta Lake, Snowflake, etc.)
  • Develop and optimize end-to-end data pipelines (batch and streaming) for ingesting, transforming, and storing structured and unstructured data at scale
  • Integrate diverse data sources and ensure efficient, secure, and reliable data ingestion and processing
  • Implement and enforce data governance, cataloging, lineage, and access controls (e.g., AWS DataZone / Glue Data Catalog or Unity Catalog, Collibra, Atlan)
  • Collaborate with cross-functional teams (data scientists, BI engineers, product managers) to translate business needs into reliable, observable, and governed data products
  • Drive adoption of modern data engineering frameworks (dbt, Airflow, Delta Live Tables, etc.) and DevOps practices (IaC, CI/CD, automated testing, monitoring)
  • Champion data quality, security, and compliance (encryption, PII, GDPR, HIPAA, etc.) across all data lake/lakehouse operations
  • Mentor and guide team members, contribute to platform roadmaps, and promote best practices in data engineering and lakehouse design
  • Stay current with emerging trends in lakehouse technologies, open-source tools, and cloud platforms
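The raw → curated → consumption zoning (the "medallion" layout referenced below under design patterns) can be sketched as a small path helper. This is an illustrative sketch only; the bucket, zone, and dataset names are assumptions, not conventions from this posting.

```python
# Illustrative medallion-zone layout for a lakehouse on S3.
# Zone names and the s3:// prefix scheme are assumptions for this sketch.
ZONES = ("raw", "curated", "consumption")

def zone_path(bucket: str, zone: str, dataset: str) -> str:
    """Build the S3 prefix for a dataset in a given lakehouse zone."""
    if zone not in ZONES:
        raise ValueError(f"unknown zone: {zone!r}")
    return f"s3://{bucket}/{zone}/{dataset}/"

def promote(path: str) -> str:
    """Return the same dataset's path in the next zone (raw -> curated -> consumption)."""
    scheme, _, rest = path.partition("://")
    bucket, zone, dataset, *_ = rest.split("/")
    next_zone = ZONES[ZONES.index(zone) + 1]  # IndexError past 'consumption'
    return f"{scheme}://{bucket}/{next_zone}/{dataset}/"
```

Keeping zone derivation in one helper makes promotion jobs (raw-to-curated, curated-to-consumption) symmetric and easy to audit.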

Qualifications & Requirements

We’re looking for strong collaborators who deliver exceptional client experiences and thrive in fast-paced, team-oriented environments. Our ideal candidates pursue greatness, act with integrity, and are driven to help our clients succeed. We value those who embrace creativity and continuous improvement, and who contribute to a culture where we win together and create and share joy in our work.

Required

  • Proven track record of leading and developing high-performing teams
  • 10+ years of experience in data engineering, software engineering, and/or cloud engineering, with at least 5 years focused on leading and establishing data lake or lakehouse transformation via AWS
  • Bachelor’s degree in Data Science, Computer Science, or a related field; Master’s degree preferred
  • Demonstrable hands-on experience with:
    • Cloud data lake architectures: AWS S3, Glue, Lake Formation, Snowflake, or similar
    • Data lake design patterns: raw, curated, consumption zones; medallion architecture
    • Data versioning and schema evolution: e.g., Delta Lake, Apache Iceberg
    • Data governance and cataloging: experience with any of the following (multiple tools preferred): Unity Catalog, Collibra, Atlan, AWS Glue Data Catalog
    • Programming: Python and/or SQL (production code, reusable libraries, tests)
    • Pipeline orchestration: Airflow, Step Functions, dbt, or similar
    • DevOps for data: Terraform/CloudFormation, CI/CD, monitoring, and runbook creation
  • Strong understanding of data modeling, data quality, and secure data onboarding/governance
  • Experience with both batch and real-time data processing
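The schema-evolution requirement above (Delta Lake / Apache Iceberg style) generally means additive changes are allowed while in-place type changes are rejected. A minimal pure-Python sketch of that rule, assuming schemas are plain column-name-to-type-name maps with illustrative column names:

```python
# Additive schema evolution, in the spirit of Delta Lake's mergeSchema:
# new columns merge in; a type change on an existing column is an error.
def evolve_schema(current: dict, incoming: dict) -> dict:
    merged = dict(current)
    for col, typ in incoming.items():
        if col in merged and merged[col] != typ:
            raise TypeError(f"type change for {col!r}: {merged[col]} -> {typ}")
        merged[col] = typ  # new or identical column: additive, allowed
    return merged
```

In a real pipeline this check would run against the table format's catalog metadata rather than plain dicts, but the contract is the same.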

Core Competencies

  • Excellent communication and stakeholder management skills.

Preferred

  • Experience with Spark, Snowflake, or other big data frameworks
  • AWS and/or Snowflake architect or developer certifications
  • Demonstrated use of AI/ML tools to augment engineering productivity (prompting for code generation, LLMs for docs/tests, query optimization)
  • Experience with knowledge graphs and semantic data modeling

Skills & Tools

  • AWS (S3, Glue, Lake Formation, IAM), Snowflake
  • SQL, Python
  • dbt, Airflow, Step Functions
  • Terraform/CloudFormation, CI/CD (GitHub Actions, Jenkins)
  • Observability (Dynatrace preferred, Datadog, Prometheus)
  • Data governance & security (Unity Catalog, Collibra, Atlan)
  • LLM/AI augmentation tooling (preferred)
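The orchestration tools listed above (Airflow, Step Functions, dbt) all reduce to running tasks in dependency order. A toy sketch of that idea using only the standard library; the task names are hypothetical:

```python
# Dependency-ordered task scheduling, the core idea behind Airflow DAGs,
# Step Functions state machines, and dbt model graphs.
from graphlib import TopologicalSorter

def run_order(deps: dict) -> list:
    """Return a valid execution order for tasks given {task: {upstream tasks}}."""
    return list(TopologicalSorter(deps).static_order())
```

For example, an ingest → transform → publish pipeline always schedules ingest before transform and transform before publish, whatever other tasks are in the graph.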

Benefits

No information available.

Company Profile

LPL Financial
Industry: Financial Services
Revenue: $9.74B
Employees: 6,900
Fortune 500 Rank: #440
Global 500 Rank: N/A
