At IBM Canada, we believe AI without trusted data is just hype. Our Data Platform Technical Sales team works at the intersection of enterprise data strategy and real client outcomes — helping Canada's largest organizations move from fragmented data estates to intelligent, governed platforms that actually power their AI ambitions. This isn't a role for someone who wants to read from a slide deck. You'll be the person in the room who earns credibility fast — someone who can go deep on architecture with a data engineer in the morning, then reframe the same conversation around business risk and competitive advantage for a CTO in the afternoon. You'll be part of a team covering IBM's Data Fabric, Data Security, and Database portfolio — alongside Confluent's real-time data streaming capabilities — working directly with enterprise clients across Canada to validate, demonstrate, and accelerate their confidence in IBM technology.
As a Customer Success Engineer on the Data & AI Platform team, your primary motion is pre-sales technical validation and trust-building — you are the reason a client moves from "interesting" to "I believe this works for us." You'll partner directly with IBM enterprise sellers, owning the technical narrative in deals and building lasting credibility with client architects, data leaders, and technology executives. Your responsibilities include:
- Own the technical story: Design, build, and deliver compelling, tailored demonstrations of IBM's Data Fabric, Data Security, and Database portfolio that connect platform capabilities to each client's specific architecture and business outcomes. Great demos don't show everything — they show the right things.
- Lead proof engagements: Run structured POCs, workshops, and solution design sessions that de-risk client decisions and accelerate deal progression. You know how to scope these tightly, deliver on time, and leave the client with confidence.
- Build trusted relationships: Become a go-to technical resource for your client contacts — not just during the sales cycle, but through onboarding and early adoption. Clients should feel like you're on their team.
- Drive thought leadership: Contribute to internal enablement, client-facing content, and team knowledge. You stay current on the competitive landscape and can speak to where IBM's platform stands relative to alternatives — clearly and honestly.
- Translate architecture into outcomes: Bridge the gap between technical capability and business value. You don't just explain what the product does — you explain what it changes for the organization.
- 3–5 years of hands-on experience in data engineering, data architecture, or a technical consulting role — you've designed and built things, not just advised on them.
- A working understanding of enterprise data architecture patterns — data fabric, data mesh, lakehouse, streaming, and governance. You don't need to have used every platform, but you need to understand the principles and be able to discuss tradeoffs.
- Some exposure to client-facing or stakeholder communication — a consulting engagement, an internal executive presentation, or a cross-functional project where you had to explain technical decisions to non-technical people. You're ready to make this your primary arena.
- The ability to tell a story. You can take a complex data platform and make it feel clear, relevant, and urgent to the person in front of you — whether that's a data architect or a CDO.
- Intellectual curiosity and a bias for learning. This portfolio is broad and evolving. We're not expecting you to know it all on day one — we're expecting you to get there fast and stay current.
- Comfort working in a dynamic, deal-driven environment where priorities shift and timelines are real.
- Data Fabric & Integration: Practical knowledge of data fabric concepts and integration architecture — how organizations connect, catalog, and govern data across hybrid and multi-cloud environments. Familiarity with metadata management, data virtualization, or data lineage tooling is an asset.
- Data Security & Governance: Understanding of enterprise data security principles — access control, data classification, policy enforcement, and compliance requirements in regulated industries. Experience with a data governance program or platform (any vendor) is valued.
- Database platforms: Hands-on experience with one or more enterprise database technologies — relational, NoSQL, or cloud-native. Familiarity with database modernization patterns (migration from on-prem to cloud, operational to analytical) is a strong asset.
- Real-time & streaming data: Awareness of event-driven architecture and streaming platforms (Apache Kafka, Confluent, or equivalent). You don't need to be a Kafka expert, but you should understand why real-time data pipelines matter and be able to speak to use cases.
- Modern data stack fluency: Comfort discussing technologies like Apache Iceberg, data lakes, Db2, Spark, or similar. Experience with Databricks or Snowflake environments is relevant, since you'll encounter them in client conversations.
- Demonstration readiness: The ability to build and deliver a credible technical demo from scratch. We will teach you the IBM portfolio — you need to bring the technical instinct and presentation confidence to make it land.
- Strong communication skills: the ability to articulate complex technical solutions and tie them to business value.