Senior Database Engineer(Elastic/Mongo/Hadoop) (San Francisco) Job at CatchProbe Intelligence Technologies, San Francisco, CA

  • CatchProbe Intelligence Technologies
  • San Francisco, CA

Job Description

Senior Database Engineer (Elastic/Mongo/Hadoop)


Workplace Type: Remote
Region: San Francisco, CA


  • Must have experience with MongoDB installations, upgrades, and support
  • Responsible for administration, maintenance, performance analysis, and capacity planning for MongoDB/Elastic/Hadoop clusters.
  • Coordinate and plan with application teams on MongoDB capacity planning for new applications.
  • Should have knowledge of the MongoDB base tools such as mongodump, mongoexport, mongorestore, mongoimport, mongostat, and mongotop
  • Must be well versed in JSON and in writing MongoDB queries, both in shell scripts and in the mongo shell
  • Should be able to support sharded clusters and perform upgrades and other config maintenance on sharded clusters
  • Must be able to address, monitor, and manage capacity requirements across all aspects: CPU, memory, and storage.
  • Must be able to assist application teams with assessment and/or resolution of performance bottlenecks observed in the MongoDB Database tier of their stack
  • Must be aware of the different authentication/authorization methods used in MongoDB (SCRAM-SHA-1, x.509, LDAP), including reconfiguration of instances with TLS/SSL and/or LDAP
  • Candidate must also be able to develop automated solutions for ad-hoc script execution requests, ad-hoc report generation, upgrades, and installations
  • Experienced in NoSQL DB technologies
  • Must be aware of how to use Cloud Manager and share relevant metrics for a given deployment when an issue arises
  • Must have experience with Docker, deploying Mongo containers running on Docker and supporting all aspects of MongoDB administration needs within a Docker container
  • Knowledge of administration and support of Hadoop systems will be an added advantage
  • Deploy Hadoop (big data) clusters; commission and decommission nodes; track jobs; monitor services such as ZooKeeper, HBase, and Solr indexing; configure NameNode HA; and schedule and configure backups and restores
  • Develop scripts to review logs and alert on long-running queries
  • Demonstrable expertise in deployment and use of Postgres/MySQL, Kafka/Kinesis, etc.
  • Strong scripting experience with Python (preferred), and Shell (secondary)
  • Individually build services and expose internal APIs for those services so that other teams and workflows can use data infrastructure automation components.
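The script-development duties above (reviewing logs and alerting on long-running queries) can be sketched in Python. The helper below is hypothetical, not part of the posting: it filters documents shaped like the output of MongoDB's `db.currentOp()`, with the field names and the 30-second threshold chosen as illustrative assumptions.

```python
# Hedged sketch: flag long-running operations from a list of
# db.currentOp()-style documents. The "secs_running" field name
# mirrors MongoDB's currentOp output; the threshold is an assumption.

def find_long_running(ops, threshold_secs=30):
    """Return the operations whose secs_running exceeds threshold_secs."""
    return [
        op for op in ops
        if op.get("secs_running", 0) > threshold_secs
    ]

if __name__ == "__main__":
    # Sample documents standing in for a live db.currentOp() result.
    sample_ops = [
        {"opid": 101, "secs_running": 5, "ns": "app.users"},
        {"opid": 102, "secs_running": 95, "ns": "app.orders"},
    ]
    for op in find_long_running(sample_ops):
        print(f"ALERT: opid={op['opid']} ns={op['ns']} "
              f"running {op['secs_running']}s")
```

In a real deployment this filter would be fed from `db.currentOp()` (e.g. via PyMongo) on a schedule, with the alert wired to the team's paging or logging system.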

Required Skills

  • Strong understanding of various relational and non-relational database technologies, including their benefits, downsides, and best use cases, with the ability to help application teams choose the correct database technology for their specific business use case
  • 5+ years installing, automating, scaling, and supporting NoSQL databases such as MongoDB, Elasticsearch, and Hadoop, among other emerging technologies
  • 1-2 years of experience working with databases in public clouds such as Hetzner, AWS, Azure, and GCP
  • Proficiency in automation
  • Knowledge of Ansible, Python, Terraform
  • Willingness and commitment to learn other database, automation, and cloud technologies
  • Great communication and collaboration skills
  • Software development experience and knowledge of modern software development processes
  • Knowledge of or experience in programming languages such as Java, Go, Node.js, HTML, CSS, and Bootstrap is a plus
  • Knowledge/experience on AI/ML is a big plus
  • Ability to multi-task and prioritize with little to no supervision, while providing team leadership
  • Ability to work well under pressure
  • Consistent exercise of independent judgement and discretion in matters of significance
  • Excellent communication skills
  • Highly driven, highly involved, highly proactive
  • Datalake cluster ownership and technical point of contact for all applications on Hadoop cluster
  • Responsible for new application onboarding in the Datalake by reviewing requirements and design
  • Assist existing and new applications in arriving at the most optimized and suitable solutions for their requirements
  • L3 point of contact for issues related to Hadoop platform

Core Responsibilities

  • Develop solutions for very complex and wide-reaching systems engineering problems, set new policies and procedures, create systems engineering and architectural documentation
  • Operating systems & disk management: provide in-depth knowledge, mentor junior team members, create basic task automation scripts
  • Database platform management: maintain a master-level understanding of database concepts, availability, performance, usage, and configuration; set up, fix, and tune complex replication
  • Storage and backup: set up, solve problems and tune complex SAN software issues, maintain policies and documentation
  • Scripting and development: develop software in several modern languages, design horizontally-scalable solutions and apply professional standards
  • Networking: recommend or help architect an entire system, perform network sniffing, understand protocols
  • Application technologies: provide recommendations and advice regarding web services, OS and storage, liaise with development, QA and business teams
  • Analyze systems, make recommendations to prevent problems, lead issue resolution activities
  • Lead end-to-end audit of monitors and alarms, define requirements for new tools
  • Apply time and project management skills to lead resolution of issues, communicate necessary information, consult with clients or third-party vendors
  • Consistent exercise of independent judgement and discretion in matters of significance
  • Regular, consistent and punctual attendance

Seniority level: Mid-Senior level

Employment type: Full-time

Job function: Information Technology

Industries: Software Development



Job Tags

Full time, Remote work
