Job Description
• As a Snowflake Data Architect, you will design and implement scalable, secure, and high-performance data solutions using the Snowflake Data Cloud.
• Snowflake implementation – including design, migration, and data management.
• Deep understanding of Snowflake’s three-layer architecture (database storage, query processing, and cloud services).
• Define data models and schema strategies for analytics and reporting.
• Design scalable solutions for structured and semi-structured data.
• Integrate Snowflake with modern data stacks (e.g., Fivetran, dbt).
• Implement data migration strategies and best practices.
• Apply clustering and partitioning techniques to improve query speed (see the first sketch after this list).
• Design and implement Snowflake architecture, RBAC, PIM, and JIT access models.
• Automate user lifecycle management and quarterly access reviews.
• Establish and enforce a data classification framework (PII, PCI, PHI, financial).
• Build and maintain logging, monitoring, and anomaly detection dashboards.
• Define and implement Change Management Policy with IaC, GitHub/GitLab, and CI/CD integration.
• Lead Data Governance initiatives (metadata catalog, ownership, lifecycle management).
• Create and maintain Snowflake SOPs (access, backup/restore, data sharing, tagging).
• Manage vendor compliance (SOC 2 / SOX evidence, risk assessment, offboarding workflows).
• Conduct risk assessments, maintain Risk Register, and align with SOC 2 & SOX requirements.
• Python: experience using Snowpark is good to have, along with hands-on experience with Snowflake’s interface for interacting with LLMs and building semantic models (see the second sketch after this list).
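For illustration, a minimal Snowpark (Python) sketch of applying a clustering key to speed up query pruning. The ORDERS table, the ANALYTICS.SALES schema, and the connection values are hypothetical placeholders, and this is a sketch rather than a definitive implementation.

```python
# Minimal sketch, assuming a hypothetical ANALYTICS.SALES.ORDERS table and
# placeholder credentials; adjust for your environment.
from snowflake.snowpark import Session

connection_parameters = {
    "account": "<account_identifier>",  # placeholders, not real values
    "user": "<user>",
    "password": "<password>",
    "role": "SYSADMIN",
    "warehouse": "COMPUTE_WH",
    "database": "ANALYTICS",
    "schema": "SALES",
}
session = Session.builder.configs(connection_parameters).create()

# Define a clustering key on columns commonly used in range filters so that
# Snowflake can prune micro-partitions at query time.
session.sql("ALTER TABLE ORDERS CLUSTER BY (ORDER_DATE, REGION)").collect()

# Check how well the table is clustered on those columns.
info = session.sql(
    "SELECT SYSTEM$CLUSTERING_INFORMATION('ORDERS', '(ORDER_DATE, REGION)')"
).collect()
print(info[0][0])

session.close()
```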
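And a minimal sketch of calling a Snowflake-hosted LLM from Snowpark via the SNOWFLAKE.CORTEX.COMPLETE SQL function, assuming Cortex is enabled in the account; the model name and prompt are illustrative only.

```python
# Minimal sketch, assuming Snowflake Cortex is available in the account;
# the model name and prompt are illustrative only.
from snowflake.snowpark import Session

session = Session.builder.configs({
    "account": "<account_identifier>",  # placeholders, not real values
    "user": "<user>",
    "password": "<password>",
    "warehouse": "COMPUTE_WH",
}).create()

# Send a prompt to a Snowflake-hosted LLM through the CORTEX.COMPLETE SQL
# function and read back the completion.
result = session.sql(
    "SELECT SNOWFLAKE.CORTEX.COMPLETE("
    "'llama3-8b', "
    "'Summarize the purpose of a Snowflake clustering key in one sentence.'"
    ") AS answer"
).collect()
print(result[0]["ANSWER"])

session.close()
```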
Job Specification
• Bachelor’s degree in Computer Science, Software Engineering, Information Technology, or related field required.
• At least 5 years of experience developing Snowflake data products and solutions in highly complex data environments with large data volumes.
• 10+ years in Data/Cloud Engineering, with 5+ years in Snowflake architecture.
• Strong knowledge of Snowflake security, governance, RBAC, DDM, PIM.
• Experience with SOC 2, SOX, GDPR, HIPAA compliance frameworks.
• Hands-on with Terraform, dbt, GitHub/GitLab, CI/CD pipelines.
• Skilled in data monitoring, logging, and encryption practices.
• Excellent communication and documentation skills.
• An understanding of E-R data models (conceptual, logical, and physical).
• Understanding of advanced data warehouse concepts (factless fact tables, temporal/bi-temporal models, etc.) is a plus.
• Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and translate them into technical designs and solutions.
• Strong communication skills, both verbal and written. Capable of collaborating effectively with a variety of IT and business groups across regions and roles, and of interacting effectively at all levels.
• Self-starter. Proven ability to manage multiple concurrent projects with minimal supervision. Can manage a complex, ever-changing priority list and resolve conflicts between competing priorities.
Required Key Skills
• Strong problem-solving skills. Ability to identify where focus is needed and bring clarity to business objectives, requirements, and priorities.