Data Solution Architect

  • Full-time
  • Canada, EU, US
  • CAD 150,000 to 210,000 per annum

Position



OpsGuru is a global engineering and consulting group. We are experts in the container ecosystem, data processing and analytics, and cloud-native technologies. Our team is made up of network, data, security, and DevOps specialists, as well as application developers. OpsGuru empowers customers with technology to solve their business problems and provides the tools to ensure success in their digital transformation.


OpsGuru's value to our customers centers on our ability to provide deep technical guidance based on their business needs. We achieve this by assigning small, virtual teams of highly skilled individuals to each client. Within these teams, the Data Solutions Architect is responsible for providing technical expertise and leadership to Data Engineers, while also maintaining a systems view that reconciles technical decisions with broader project goals. Data Solutions Architects work alongside our Principal Consultants to ensure our project deliverables meet stakeholders' needs while upholding OpsGuru's quality and operational maturity standards.



Roles and Responsibilities



Project Delivery:

  • Provide deep technical expertise and leadership across a range of cloud data technologies. You will be the go-to person for driving tool selection, resolving complex engineering issues, and guiding best practices during engagements.

  • Lead the design and implementation of data platforms to meet customers' business requirements.

  • Lead whiteboard design sessions with internal and external team members.

  • Identify and communicate technical risks as they emerge over the course of a project.

  • Work closely with Principal Consultants to extract project requirements during technical discovery sessions, define deliverables to meet those requirements, and break down those deliverables into a technical roadmap. 

  • Lead teams of data engineers to execute project roadmaps. Provide guidance on technical tasks, priorities, and technical assistance when needed.

  • Manage scope within customer engagements. Identify changing requirements as they arise, determine their impact on scope, and ensure all stakeholders are aware and agree with the changes.

  • Maintain a close working relationship with the customer, as a "Trusted Advisor".  Set clear expectations, challenge assumptions, solicit feedback, and take ownership of project deliverables.

  • Establish credibility and build impactful relationships with our customers


Design and Delivery:

  • Design and delivery of the following:


    • Complex ETL/ELT processes, large-scale batch and real-time stream processing solutions, and optimization of data models and schemas

    • Data Lakes using native cloud tools or third-party platforms (for example Snowflake or Databricks)

    • AI-based solutions for NLP, speech and video recognition, anomaly and fraud detection, chatbots, etc.

    • End-to-end data solutions, from ingestion pipelines through cataloging and analysis to sharing insights (for example via reports, dashboards, and ML forecasts/predictions)

    • Migrations and transformations of data services such as relational databases, NoSQL databases, and Data Warehouses (heterogeneous and homogeneous)

    • Building solutions leveraging caching systems (for example Redis) and object stores (for example Amazon S3, Azure Blob Storage, Amazon Glacier, etc.)

  • Designing, building, training, and optimizing AI/ML solutions 

  • Designing and integrating MLOps solutions to support model lifecycle automation


Other Technical:

  • Develop code in Python, Java, and Scala

  • Architect and operationalize solutions following best practices including data governance, privacy, lifecycle management, and cost optimization

  • Maintain relevant certifications on cloud and data technologies and stay informed of key industry trends



Qualification & Experience



Technical:

  • 2+ years of experience in public cloud environments, preferably AWS

  • 2+ years of experience developing code in Python, Java, or Scala

  • Experience building complex data engineering pipelines and ETL/ELT tasks

  • Experience with relational databases (for example Oracle, SQL Server, MySQL, PostgreSQL, MariaDB) or managed cloud RDBMS services (such as Amazon Aurora, Azure SQL, GCP Cloud SQL, GCP Cloud Spanner, etc.)

  • Experience with NoSQL databases such as MongoDB, Cassandra, CouchDB, Elasticsearch, Neo4j, DynamoDB, Bigtable, etc.

  • Experience building batch processing and real-time processing systems

  • Experience building Data Warehouses and Data Lakes

  • Experience with 3+ of the following:


    • Building real-time streaming solutions leveraging technologies such as: Kafka, Kinesis, Pub/Sub messaging, event buses, or Spark Streaming

    • Designing and implementing analytics and BI solutions such as MicroStrategy, TIBCO, QlikView, or Tableau

    • Building AI/ML solutions, including operations such as ongoing training, optimization, and model deployment

    • Designing, implementing, training, and optimizing ML and AI models


Consulting:

  • Proven ability to discuss and design solutions using concepts such as data lineage, data quality gates, data anonymization, data governance, data security, data replication, data caching, data lifecycle management, data catalogs, and networking

  • Understanding of requirements and impacts from compliance frameworks such as HIPAA, PCI, GDPR, PIPEDA, etc. 

  • Strong communication and presentation skills, written and verbal

  • Experience writing technical documentation

  • Experience in technical mentoring

  • Experience working in a customer-facing delivery role in a consulting or professional services environment 



As an employer, Carbon60+OpsGuru recognizes the importance of balancing our careers with other aspects of our lives, and our culture reflects this ethos - from flexible work hours to health and wellness incentives to having fun along the way. We look for people who thrive in an environment of accountability and, at times, ambiguity as we adapt and grow our business.


Carbon60+OpsGuru is an equal opportunity employer. We welcome and encourage applications from people with all levels of ability. Accommodations are available on request for candidates taking part in all aspects of the selection process. We thank all applicants for their interest in this exciting opportunity. Only candidates who meet the qualifications will be contacted for an interview.

  • data lakes
  • data modeling
  • databricks
  • software engineer
  • schema design
  • AWS