Database Architect

Job description:

Define, drive, and manage (and implement parts of) the enterprise data strategy and roadmap. Create architecture diagrams and a technology navigation map. Recommend and design system architectures that align with strategic business objectives. Lead data governance, review all data and database schema changes, and implement them as needed. Work with MDM / metadata repositories and metadata lifecycle management; map various enterprise metadata models to Cerebri internal models. Work with data scientists and analysts to put machine learning models into production (this is roll-up-your-sleeves work). Build tools, frameworks, and dashboards to support running experiments and analyses. Write code to develop new software products and/or features, and manage individual project priorities, deadlines, and deliverables.


    • Experience setting up and managing multi-TB data warehouses and distributed relational / NoSQL / graph / Hive data clusters.
    • Background building data pipelines.
    • Understanding of security technologies: PKI, cryptography, identity management.
    • Design of data access APIs (see note on GraphQL).
    • Design of analytics capture and presentation.
    • Optimization of database designs and SQL/NoSQL data stores for latency and performance.
    • Experience with data curation (ETL, CRUD, audit) processes to transform data from a variety of sources into normalized forms.
    • Understanding of microservices architecture and Docker infrastructure.
    • Experience building data pipelines leveraging open source technologies like Kafka, Hadoop, Hive, and Spark.
    • Experience working with RDBMS databases (e.g., PostgreSQL or SQL Server), including managing connection pools, performance tuning, and optimization.
    • 7+ years of experience developing large-scale, highly available distributed systems.
    • 7+ years of programming in SQL and Python.
    • BS / MS / PhD in Computer Science or a related engineering field.
    • Strong sense of ownership, passion for building quality products at massive scale in a collaborative, agile environment, and excitement to learn.

Nice to haves...

    • Experience with data governance and managing large data schemas.
    • Experience with international standards and regulations for security and privacy (HIPAA, Privacy Shield, GDPR, PCI DSS).
    • Demonstrated hands-on experience with blockchain technology, ideally with a Hyperledger Fabric or Enterprise Ethereum implementation.
    • Background building data pipelines leveraging Azure/Microsoft technologies, e.g., Azure Data Lake, HDInsight.
    • GraphQL.

