Job Title: Snowflake Developer
Location: Hyderabad, India
Experience Required: 5+ Years
Technical Knowledge:
Snowflake (Warehousing, Streams, Tasks, Cloning, Time Travel), Advanced SQL, Python
dbt (data build tool)
Azure: Data Factory (ADF), Synapse Analytics, Python notebooks
AWS: S3, Glue (ingestion), Lambda (automation)
Google Cloud: BigQuery (integration and data migration), Cloud Storage (GCS)
GitLab and CI/CD pipelines
Power BI/Tableau (data visualization)
Data cataloging (e.g., Alation, Collibra)
Data governance (RBAC, row-level security, data masking in Snowflake)
Role Summary:
We are looking for a Snowflake Developer with deep expertise in Snowflake, SQL, Python, and modern data engineering practices. You will design, develop, and optimize scalable, high-performance solutions on the Snowflake Data Cloud, working closely with cross-functional teams to build efficient data models, support advanced analytics, and enable data-driven decision-making across the organization. The role suits a self-driven developer who thrives in fast-paced environments and is passionate about data architecture, performance optimization, and automating data workflows.
Key Responsibilities:
Design, develop, and maintain scalable and efficient data pipelines and ELT processes using Snowflake, SQL, and Python.
Build and optimize Snowflake data models (star/snowflake schemas) to support analytics, reporting, and business intelligence use cases.
Leverage Snowflake features such as Streams, Tasks, Time Travel, Cloning, and Materialized Views to build advanced data workflows (see the illustrative sketch after this list).
Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and translate them into Snowflake solutions.
Implement and enforce data governance, access control (RBAC), data masking, and row-level security within Snowflake.
Integrate Snowflake with external systems using tools such as dbt, Azure Data Factory (ADF), AWS Glue, Python Notebooks, and cloud storage solutions (AWS S3, GCS, Azure Blob).
Ensure high performance and cost efficiency by tuning queries, optimizing warehouse usage, and managing compute resources.
Create and maintain documentation of Snowflake environments, data pipelines, schema changes, and workflows.
Collaborate with DevOps and platform teams to implement CI/CD pipelines, automate deployments, and promote best practices in version control.
Support data migration and modernization initiatives, including transitions from legacy warehouses to Snowflake.
Stay current with Snowflake's latest features and enhancements, and recommend and implement improvements.
Mentor junior developers, conduct code reviews, and contribute to knowledge-sharing and training efforts across the team.
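For context, the Streams/Tasks pattern referenced above can be sketched as follows. This is a minimal, illustrative example only, assuming the snowflake-connector-python package and hypothetical object names (RAW_ORDERS, ORDERS_CLEAN, TRANSFORM_WH):

    import os
    import snowflake.connector

    # Connect with credentials from the environment (hypothetical account layout).
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    cur = conn.cursor()
    try:
        # A stream captures row-level changes (CDC) on RAW_ORDERS so that
        # downstream steps consume only new or changed rows.
        cur.execute("CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders")

        # A task runs on a schedule and loads the captured rows into the curated
        # table; SYSTEM$STREAM_HAS_DATA skips runs when the stream is empty.
        cur.execute("""
            CREATE OR REPLACE TASK merge_orders_task
              WAREHOUSE = TRANSFORM_WH
              SCHEDULE = '5 MINUTE'
            WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
            AS
              INSERT INTO orders_clean
              SELECT order_id, customer_id, amount
              FROM orders_stream
              WHERE METADATA$ACTION = 'INSERT'
        """)

        # Tasks are created suspended and must be resumed explicitly.
        cur.execute("ALTER TASK merge_orders_task RESUME")
    finally:
        cur.close()
        conn.close()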
Requirements:
5+ years of hands-on experience working with Snowflake in a data engineering or data development role.
Strong expertise in SQL, including query optimization, complex joins, CTEs, window functions, and writing efficient stored procedures in Snowflake using JavaScript or SQL.
Deep knowledge of Snowflake features such as Streams, Tasks, Time Travel, Cloning, Materialized Views, and Secure Data Sharing.
Proficient in Python for scripting and automation, including integration with Snowflake for dynamic workflows and error handling.
Expertise in data modeling techniques (star/snowflake schema) and best practices for scalable data warehouse design.
Experience integrating Snowflake with Azure (ADF, Synapse), AWS (S3, Glue, Lambda), or GCP (BigQuery, GCS) environments.
Hands-on experience building and deploying CI/CD pipelines using tools like GitLab, GitHub Actions, or similar.
Strong understanding of data governance, access control (RBAC), row-level security, data masking, and auditing within Snowflake (illustrated in the sketch following this list).
Familiarity with developing automated procedures for monitoring data quality and pipeline health and for optimizing warehouse usage.
Solid experience with data quality, cataloging, and metadata management tools.
Excellent communication skills and ability to work closely with cross-functional technical and business teams.
Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related technical field.
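As an indication of the governance skills listed above, the sketch below applies column masking, row-level security, and an RBAC grant in Snowflake. Table, column, and role names (CUSTOMERS, EMAIL, PII_READER, ANALYST) are hypothetical; connection setup is as in the earlier sketch:

    import os
    import snowflake.connector

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ADMIN_WH",
        database="ANALYTICS",
        schema="CUSTOMER",
    )
    cur = conn.cursor()
    try:
        # Column-level masking: only PII_READER sees raw e-mail addresses.
        cur.execute("""
            CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING)
            RETURNS STRING ->
              CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
                   ELSE '*** MASKED ***' END
        """)
        cur.execute("ALTER TABLE customers MODIFY COLUMN email "
                    "SET MASKING POLICY email_mask")

        # Row-level security: a row access policy filters rows per role
        # (a static rule here; real policies usually join a mapping table).
        cur.execute("""
            CREATE OR REPLACE ROW ACCESS POLICY emea_only AS (region STRING)
            RETURNS BOOLEAN ->
              CURRENT_ROLE() = 'SYSADMIN' OR region = 'EMEA'
        """)
        cur.execute("ALTER TABLE customers ADD ROW ACCESS POLICY emea_only ON (region)")

        # RBAC: privileges are granted to roles, never directly to users.
        cur.execute("GRANT SELECT ON TABLE customers TO ROLE analyst")
    finally:
        cur.close()
        conn.close()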
Preferred Skills:
Experience with Snowflake stored procedures, real-time data pipelines, and data observability tools (a minimal stored-procedure sketch follows this list).
Familiarity with Power BI/Tableau for analytics, and with cost optimization techniques in Snowflake.
Exposure to multi-cloud architectures and DataOps practices.
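For reference, a Snowflake stored procedure of the kind mentioned above can be as small as the sketch below, using JavaScript (one of the two procedure languages named in the requirements). The EVENT_LOG table and retention rule are hypothetical:

    import os
    import snowflake.connector

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",
        database="ANALYTICS",
        schema="OPS",
    )
    cur = conn.cursor()
    try:
        # A JavaScript procedure that deletes rows older than DAYS days.
        # Argument names are uppercase inside JavaScript procedure bodies.
        cur.execute("""
            CREATE OR REPLACE PROCEDURE purge_old_rows(DAYS FLOAT)
            RETURNS STRING
            LANGUAGE JAVASCRIPT
            AS
            $$
              var stmt = snowflake.createStatement({
                sqlText: "DELETE FROM event_log " +
                         "WHERE event_ts < DATEADD('day', ?, CURRENT_TIMESTAMP())",
                binds: [-DAYS]
              });
              stmt.execute();
              return "purged rows older than " + DAYS + " days";
            $$
        """)

        # Call it: keep 90 days of history; the return string comes back as a row.
        cur.execute("CALL purge_old_rows(90)")
        print(cur.fetchone()[0])
    finally:
        cur.close()
        conn.close()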