Senior Software Engineer - Data/ETL
Title: Senior Software Development Engineer – Data/ETL
Location: Remote
Openings: 1
Type: Full-time Hire

Company Description
CLIENT helps life science companies solve the complexities that come with digital transformation by developing end-to-end AI solutions that deliver personalized business intelligence through integrated applications and API-accessible services. Healthcare innovation is best served when individuals with diverse backgrounds come together with a common purpose and clear objectives to improve patient lives. We are product strategists, engineers, data scientists, and designers who are experts in our domain and passionate about our mission to accelerate innovation, collaboration, and scientific discovery for life sciences.

Job Description
CLIENT is searching for an experienced Senior Software Development Engineer (SDE) to work within an agile team of software architects and developers to implement and maintain secure, scalable, cloud-based data/ETL pipelines for our public and customer-specific data integrations.
Technical aptitude is a must, and so are the right team-minded attitude and the ability to work interchangeably with others.
Applicants should be top-notch full-stack developers with a passion for backend service-oriented architectures, scalable system design, databases and data models, and API-driven design. This position will report directly to the Vice President of Engineering.
We are a remote-first engineering organization, so having a home office setup for remote work is key.
You will be issued all the tools you need to perform well at home: a company MacBook Pro, an extra monitor, an iPad, and other accessories needed for the work.
You will be given a budget for continuing education, online courses, conferences, books, etc.
We have regular daily standups with different teams in the engineering organization, and regular get-togethers over video conferencing to discuss best practices, plan infrastructure together, and do occasional pair programming.

In this position you will:
Own and implement the key data ingestion, transformation & delivery pipelines for the company's flagship SaaS platform.
Live and breathe a DevOps culture on a small team of Senior SDEs and SREs who work hand-in-hand with you to deliver modern, scalable SaaS software across multiple cloud providers.
Evolve our existing data-lake-to-warehouse infrastructure to accommodate the scaling needs of our platform and our clients.
Work in an agile manner with transparency and fluid communication within the engineering team and across other teams in the company.
Work with the product owner to understand internal and external requirements and help the team design solutions that satisfy them.
Work with Senior SREs to contribute to the overall platform stability, availability, and quality.
Interact directly with SREs and customers to triage, debug, and remediate production issues.
Experience and Qualifications:
Education: Formal higher-education degrees in STEM fields or vocational certificates related to software development are preferred, but not required.
Real-world experience and a proven track record count just as much, if not more.
Experience with Sidekiq, message queues, custom DSLs, or very similar technologies is a must.
5+ years of professional experience developing ETL software for modern cloud environments.
8+ years of professional programming experience with languages and frameworks such as Ruby, Rails, Python, etc.
Familiarity with cloud and Linux system administration is important, as is IaaS, PaaS, and container/orchestration experience.
Proven track record of success in a high-functioning continuous delivery environment.
Working knowledge and hands-on experience with Docker and Kubernetes (or similar container/orchestration).
Expertise with a variety of relational and NoSQL databases; MSSQL, PostgreSQL, MySQL, Couchbase, Neo4j, Document DB, DynamoDB, MongoDB, and Redis preferred.
Solid understanding of modern API design patterns (REST, GraphQL, JSON, etc.).
Understanding of source control systems (git, hg, etc.) and related workflow patterns.
Familiarity with Big Data systems such as Hadoop, HBase, HDFS, Spark, Kafka, or similar.
Understanding of privacy-by-design frameworks and code-level security techniques is a plus.
Strong verbal and written communication and documentation skills are required.