Senior BI Engineer

Job Description

We are seeking a Senior BI Engineer to support and enhance our analytics, reporting, and cloud‑based data engineering capabilities within a modern Azure and Lakehouse environment. This role is ideal for a technically strong, detail‑oriented professional who can work across BI development, data pipelines, and cloud platforms while collaborating with cross‑functional teams to deliver reliable, scalable analytics solutions. You will contribute to dashboard development, data transformations, and operational excellence across our data ecosystem.

Location: Lewisville, TX

Position Type: Full-Time

Hours: 9:00 am to 5:00 pm, Monday – Friday

Key Responsibilities

  • Design enterprise-grade semantic models, build complex DAX measures, and develop highly interactive Power BI reports.
  • Implement Row-Level Security (RLS), optimize data models, and enhance report performance using best practices such as aggregations, composite models, and query reduction.
  • Develop high-performing Tableau dashboards using appropriate strategies (extracts vs. live connections), LOD expressions, optimized calculations, data source filters, and workbook performance tuning techniques.
  • Design and maintain star/snowflake schemas, build conformed dimensions, and support standardized KPI definitions across business units.
  • Integrate and model data from SAP HANA, SharePoint, Databricks Delta tables, and Lakehouse data (Delta/Parquet).
  • Implement data certification, lineage documentation, metadata standards, and usage analytics; collaborate closely with data engineering teams for reliable and scalable data refresh pipelines.
  • Profile and tune queries, implement incremental refresh strategies (Power BI/Tableau), and ensure efficient data refresh and distribution.
  • Gather business requirements, build prototypes, deliver stakeholder demos, and create reusable dashboard templates and visual frameworks.
  • Develop Databricks notebooks for batch and streaming workloads; apply Delta Lake best practices including ACID transactions, Z-Ordering, OPTIMIZE, and VACUUM; implement the Gold and Platinum layers of the Medallion architecture.
  • Orchestrate and monitor data workflows using Azure Databricks Jobs and Fabric/ADF pipelines when applicable.

Technical Experience

Minimum Degree Requirement: Bachelor’s degree in Computer Science, Information Technology, Software Engineering, or a closely related field.

Minimum Work Experience:

3 years of total professional experience in software engineering.

Experience must include the following:

  • 3 years of hands‑on experience with Power BI for dashboarding and reporting.
  • 2 years of experience using Tableau to build interactive visualizations.
  • 1 year of experience working with Microsoft Azure cloud services.
  • 2 years of practical experience using Databricks for data engineering workloads.
  • 1 year of exposure to Lakehouse architecture principles.
  • 2 years of experience implementing and managing Delta Lake for data reliability.
  • 1 year of experience with Azure DevOps for CI/CD and version control.
  • 2 years of experience working with SAP HANA for data modeling and analytics.
  • 3 years of experience writing SQL and PL/SQL for data querying and development.
  • 2 years of experience using JIRA for Agile project tracking and issue management.

Interested in joining our team? Send your updated resume with the position name in the subject line to

info@texplorers.com

If your profile matches our current openings, our team will get in touch with you.
