Intuitive Surgical Is Hiring a Data Engineer (Team Member Referral Email Included)

Intuitive Surgical Inc., an American corporation that develops, manufactures, and markets robotic products, most notably the da Vinci Surgical System, is hiring!

Job Title: Software/Data Engineer – Data/ML Pipeline
Department: Advanced Product Department
Reports to: Manager, Data Engineering

Email: isurgical_jobs@duoduo.io

Company Description:

Intuitive Surgical designs and manufactures state-of-the-art robot-assisted systems for use in minimally invasive surgery. Joining Intuitive Surgical means joining a team dedicated to using technology to benefit patients by improving surgical efficacy and decreasing surgical invasiveness, with patient safety as our highest priority.

Primary Function of Position:

Contribute broadly to the engineering team responsible for advanced analytics and new technology development focused on surgical workflow and performance for next-generation robotic surgery platforms.

The successful candidate must excel in a high-energy, focused, small-team environment, be able to initiate and drive new research directions, and have a commitment to high research quality. A strong sense of shared responsibility and shared reward is required.

As part of the New Product Development team, immediate responsibilities include:

  • Develop and support tools that analyze surgeries within a machine learning cloud pipeline
  • Design and maintain relational databases and distributed data storage for various scenarios
  • Work closely with the Machine Learning and Engineering teams, understand their various needs, and provide modernized, standardized support throughout development
  • Provide efficient and scalable ETL solutions for video, system data, and metadata
  • Test and maintain growing databases of video, system data, and metadata
  • Work closely with research data scientists throughout analysis, visualization, and monitoring efforts
  • Support multiple engineering and analytics teams in collecting, accessing, and analyzing data, as well as troubleshooting dataflow

Additional responsibilities include:

  • Keep current with our software/hardware deployment and cybersecurity strategies, and understand overall trends in AI, ML, and cloud technologies

  • Work with product development teams to bring data products to market

  • Continuously explore optimal data recording solutions, database designs, storage solutions, and analysis environments

Skill/Job Requirements:

Competency Requirements (based upon education, training, skills, and experience). In order to perform the responsibilities of this position, the individual must have:

  • M.S. in Computer Science, Software/Computer Engineering, or Applied Math with a minimum of three (3) years of industry experience in software/data engineering, or a B.S. degree with a minimum of six (6) years of industry experience

  • Extensive experience with distributed Airflow or a similar distributed cluster scheduler

  • Experience with cloud storage, distributed systems/clusters, and ETL at scale

  • Experience with the GCP environment is ideal

  • Hands-on experience with Python, GNU tools, C/C++, OOP, RDD, TDD, and DDD

  • Hands-on experience with data-processing/ML pipelines

  • Experience with log-monitoring solutions

  • Proficient in SQL-based and NoSQL technologies

  • Ability to travel domestically and internationally (10-20%)

  • Demonstrated excellent communication skills, both written and verbal

Highly Desirable Knowledge, Skills, and Experience:

  • Interest in early research and development/prototype efforts
  • Excellent experience with distributed systems/clusters, ETL, Docker, and job schedulers
  • Proficient in SQL and NoSQL technologies
  • Experience with video recording, compression, and streaming