Responsibilities:
Combine your technical expertise and passion for problem-solving to work closely with clients, turning complex ideas into end-to-end solutions that transform their businesses
Translate client requirements to system design and develop a solution that delivers business value
Lead the design, development and delivery of large-scale data systems, data processing and data transformation projects
Automate data platform operations and manage the post-production system and processes
Conduct technical feasibility assessments and provide estimates for the design and development of solutions
Skills & Experience:
Demonstrable experience in data platforms involving implementation of end-to-end data pipelines
Ability to apply Snowflake best practices to a Snowflake Data Warehouse; strong development background in Snowflake stored procedures using JavaScript and SQL, data modeling, and Snowflake database setup, configuration and deployment
Hands-on experience with Azure cloud data services
Implementation experience with column-oriented database technologies, NoSQL database technologies (e.g. Cosmos DB) and traditional relational database systems (e.g. SQL Server, MySQL)
Experience implementing data pipelines for both streaming and batch integrations using tools/frameworks such as Azure Data Factory, Spark, Spark Streaming and Python scripting
Ability to handle module- or track-level responsibilities and contribute to tasks hands-on; experience in data modeling, warehouse design and fact/dimension implementations
Experience working with code repositories and continuous integration
Before you apply, please check if any restrictions apply in terms of time zone or country.
This job has a geo-restriction in place: USA Only.