Salesforce (501+ Employees, 32% 2 Yr Employee Growth Rate)
To get the best candidate experience, please consider applying for a maximum of 3 roles within 12 months to ensure you are not duplicating efforts.
Products and Technology
Slack is looking for a Staff Data Engineer to join the Data Ingestion Team. As part of the Data Engineering organization, we build and operate the platform that ingests data into our Data Warehouse. We write software to manage ingestion for thousands of stateful hosts and stateless real-time logging events. Currently, our infrastructure handles 65PB+ of storage and processes ~900B records a day across 400+ ETL pipelines and 900+ active Airflow DAGs. As Slack’s data grows (along with the number of customers, features, and employees), the team’s goal is to build a highly scalable and resilient ingestion platform that acquires high-quality data efficiently and provides easy-to-use workflow and orchestration capabilities for our downstream customers so that they can focus on their strengths.
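For a flavor of the orchestration problems involved: at its core, scheduling ETL pipelines means running tasks in dependency order. A minimal sketch using only Python's standard library (the task names are hypothetical, not Slack's actual DAGs; real schedulers like Airflow add retries, backfills, and parallelism on top):

```python
from graphlib import TopologicalSorter

# Hypothetical ingestion DAG: each task maps to the set of tasks it depends on.
dag = {
    "extract_logs": set(),
    "extract_db_snapshots": set(),
    "validate_schemas": {"extract_logs", "extract_db_snapshots"},
    "load_warehouse": {"validate_schemas"},
    "refresh_olap_tables": {"load_warehouse"},
}

# static_order() yields one valid execution order for the whole DAG.
order = list(TopologicalSorter(dag).static_order())
print(order)
```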
You will build scalable services and tools that help partners implement, deploy, and analyze data assets with a high level of autonomy and limited friction. You will play a meaningful role in making partner interactions with the Data Warehouse pleasant and productive. You have deep technical skills, are a self-starter, are detail- and quality-oriented, and are passionate about driving data-driven decisions and having a huge impact at Slack!
Here are a few blog posts that shed light into what we do here at Slack:
- Data Lineage at Slack
- Reliably Upgrading Apache Airflow at Slack’s Scale
- Introducing Data Residency for Slack
- Data Wangling At Slack
What you will be doing
- Design and develop highly scalable and resilient services/data pipelines for data ingestion and processing using modern big data technologies.
- Develop and maintain our real-time analytics/low-latency data access layer built on top of modern OLAP solutions.
- Optimize the end-to-end workflow for data users at Slack (from libraries for scheduling data pipelines to tools for accessing data assets).
- Automate and manage the lifecycle of data sets (schema evolution, metadata management, change and backfill management, deprecation, and migration).
- Improve the data quality and reliability of the pipelines through proper monitoring and failure detection.
- Collaborate comfortably with cross-functional partners and lead technical initiatives end to end.
- Be a role model and a multiplier, coaching and mentoring other engineers across the org.
- Write technical design proposals, and review and provide feedback on proposals from others.
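As one concrete slice of the data-set lifecycle work above, automating schema evolution usually starts with a compatibility check between schema versions. A minimal, hypothetical sketch (column names and rules are illustrative; real systems such as Avro or Protobuf schema registries apply richer rules):

```python
def backward_incompatible_changes(old_schema: dict, new_schema: dict) -> list:
    """Return changes that would break existing readers of the data set.

    Dropping a column or changing its type is treated as breaking;
    adding a new column is considered safe here.
    """
    problems = []
    for column, col_type in old_schema.items():
        if column not in new_schema:
            problems.append(f"dropped column: {column}")
        elif new_schema[column] != col_type:
            problems.append(f"type change on {column}: {col_type} -> {new_schema[column]}")
    return problems

# Illustrative schema versions: v2 retypes "ts" and drops "event".
v1 = {"user_id": "string", "ts": "timestamp", "event": "string"}
v2 = {"user_id": "string", "ts": "long", "channel": "string"}
print(backward_incompatible_changes(v1, v2))
```

A check like this can gate pipeline deploys, so a breaking schema change triggers a migration or backfill plan instead of silently breaking downstream consumers.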
What you should have
- 7+ years of software/data engineering experience, including experience with Big Data technologies, e.g. Spark, Kafka, Airflow, EMR, S3, etc.
- You have extensive experience building and maintaining large-scale ETL pipelines and in-depth knowledge of various big data frameworks and architectures.
- You are skilled at designing and building robust distributed microservices with tools like Docker, Kubernetes, and AWS ECS/EKS.
- Experience with a real-time analytics/low-latency data access layer backed by OLAP stores such as Apache Pinot or Apache Druid is a huge plus.
- You have a strong dedication to code quality, automation, and operational excellence: CI/CD pipelines, unit/integration tests.
- You are proficient in object-oriented and/or functional programming languages: Python, Java/Scala, Go
- You have excellent written and verbal communication and interpersonal skills; you can collaborate effectively with cross-functional partners and explain sophisticated technical concepts to non-technical stakeholders.
- You have high growth expectations for yourself and your team, and a willingness to push yourself and your team to achieve them
- Bachelor’s degree in Computer Science, Engineering or related field, or equivalent training, fellowship, or work experience
Slack has a positive, diverse, and supportive culture; we look for people who are curious, inventive, and work to be a little better every single day. In our work together we seek to be smart, humble, hardworking, and, above all, collaborative.
For Colorado-based roles: Minimum annual salary of $166,000. You may also be offered a bonus, restricted stock units, and benefits. More details about our company benefits can be found at the following link: https://www.getsalesforcebenefits.com/
If you require assistance due to a disability applying for open positions please submit a request via this Accommodations Request Form.
At Salesforce we believe that the business of business is to improve the state of our world. Each of us has a responsibility to drive Equality in our communities and workplaces. We are committed to creating a workforce that reflects society through inclusive programs and initiatives such as equal pay, employee resource groups, inclusive benefits, and more. Learn more about Equality at Salesforce and explore our benefits.
Salesforce.com and Salesforce.org are Equal Employment Opportunity and Affirmative Action Employers. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender perception or identity, national origin, age, marital status, protected veteran status, or disability status. Salesforce.com and Salesforce.org do not accept unsolicited headhunter and agency resumes. Salesforce.com and Salesforce.org will not pay any third-party agency or company that does not have a signed agreement with Salesforce.com or Salesforce.org.
Salesforce welcomes all.