Staff Software Engineer, Data Platform (Contract)
Who we are:
Shape a brighter financial future with us.
Together with our members, we’re changing the way people think about and interact with personal finance.
We’re a next-generation fintech company using innovative, mobile-first technology to help our millions of members reach their goals. The industry is going through an unprecedented transformation, and we’re at the forefront. We’re proud to come to work every day knowing that what we do has a direct impact on people’s lives, with our core values guiding us every step of the way. Join us to invest in yourself, your career, and the financial world.
SoFi runs on data! We are seeking a highly motivated Staff Software Engineer to join our Data Platform team. As a Staff Software Engineer, you will work alongside our experienced team of data engineers and product managers to develop and maintain our cutting-edge data handling platform using Snowflake, dbt, Sagemaker, and Airflow. In this role you will be contributing to the long-term success of SoFi’s data vision by building out distributed systems and scalable data platforms.
As a Staff Engineer on the Data Platform team at SoFi, you'll build critical components and features. You will implement battle-tested patterns and interfaces, squash bugs, refactor code, and continually grow as an engineer. The ideal candidate has a strong software engineering background and problem-solving ability, along with a cloud computing (AWS) and data engineering skill set and prior experience with technologies such as Snowflake, Airflow, dbt, Kafka, Spark, Dask, Python, and Tableau. Additionally, you will demonstrate SoFi’s core values by honing your skills as an effective communicator, showing personal responsibility, and setting ambitious goals. If you like working on problems with tangible and lasting impact, we would love to have you on our team!
What you’ll do:
- Lead architectural design sessions for the modern data stack, emphasizing solutions that seamlessly integrate Snowflake, Databricks, Airflow, dbt, Great Expectations, and AWS data services.
- Drive the development of advanced features within the data platform, ensuring modular, efficient, and scalable code structures optimized for the aforementioned stack.
- Spearhead rigorous code review processes, underscoring best practices, efficiency, and optimal use of Snowflake's unique capabilities and AWS data services.
- Collaborate closely with product and data governance leadership to understand intricate user needs and translate them into technical designs that leverage the power of Snowflake, Airflow, dbt, Terraform, DMS, Kafka, and Great Expectations.
- Foster and facilitate internal technical sessions, exploring nuances of AWS data services like DMS, MSK (Kafka), and S3, and sharing best practices for integration with the broader data stack.
- Stay at the forefront of advancements in data engineering methodologies, particularly as they pertain to Snowflake, Databricks, and other modern data tools.
- Engineer sophisticated data pipelines using dbt, Airflow, and Snowflake, with special emphasis on performance optimization and data integrity using Great Expectations.
- Architect robust data governance systems, ensuring strict data integrity, compliance, and robust metadata management within the Snowflake and AWS ecosystems.
- Design and implement advanced data anomaly detection systems within the Snowflake-dbt paradigm, utilizing Great Expectations for data quality checks.
- Leverage Python and SQL scripting proficiencies for intricate data operations, custom ETL/ELT processes, and sophisticated data transformations across the platform.
- Mentor technical team members in best practices for Snowflake, Databricks, Airflow, dbt, and AWS data services, promoting a culture of technical distinction and innovation.
What you’ll need:
- Advanced degree in Computer Science, Engineering, or an allied technical discipline.
- A minimum of 5 years in a pivotal Software/Data Engineering role, with deep exposure to modern data stacks, particularly Snowflake, Airflow, dbt, and AWS data services.
- Profound hands-on experience with Snowflake's data warehousing solutions, Databricks' analytics platform, and the orchestration capabilities of Airflow.
- Expertise in establishing data quality assurance frameworks, particularly using Great Expectations, and crafting data governance strategies tailored to modern data stacks.
- Proven skills in metadata management, distributed data architecture, and optimizing dbt models and transformations.
- Mastery of Python and SQL for complex operations within Snowflake and AWS services like DMS, MSK (Kafka), Kinesis, and S3.
- Solid experience with Terraform or CloudFormation as IaC solutions.
- Demonstrable problem-solving capabilities, especially within the context of the modern data stack.
- Exceptional technical communication skills, adept at liaising with both technical peers and diverse stakeholders within a data-driven organization.
Nice to have:
- Interest in personal finance.
- Experience working on financial regulatory projects.