Job Description:
• Support the delivery of customer-facing data shares and listings across platforms
• Build an understanding of customer use cases and how Lightcast data is consumed across external data platforms
• Translate requirements into epics, user stories, and acceptance criteria
• Support backlog prioritization and ensure engineering teams are unblocked and delivering against roadmap commitments
• Maintain and publish datasets across Snowflake Secure Shares/Marketplace, Databricks Delta Sharing, BigQuery Analytics Hub, and S3 delivery
• Ensure data listings and entitlements are accurate and delivered within SLAs
• Support the definition and maintenance of data contracts, including schemas, versioning, and deprecation processes
• Monitor data quality (freshness, null values, duplicates) and flag issues
• Contribute to product launches through documentation, changelogs, and internal enablement
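The data-quality monitoring duty above (freshness, null values, duplicates) can be sketched as a small check routine. This is a minimal illustration only: the function name, column names (`id`, `updated_at`), and the 24-hour freshness window are hypothetical assumptions, not Lightcast's actual schema or SLAs.

```python
# Hypothetical data-quality sweep; column names and thresholds are
# illustrative, not an actual Lightcast schema.
from datetime import datetime, timedelta, timezone

def quality_flags(rows, key="id", updated="updated_at", max_age_hours=24):
    """Return a list of human-readable data-quality issues found in rows."""
    issues = []
    now = datetime.now(timezone.utc)

    # Freshness: flag if the newest record is older than the SLA window.
    timestamps = [r[updated] for r in rows if r.get(updated) is not None]
    if not timestamps or now - max(timestamps) > timedelta(hours=max_age_hours):
        issues.append(f"stale: no record newer than {max_age_hours}h")

    # Null values: flag any column with missing entries.
    columns = set().union(*(r.keys() for r in rows)) if rows else set()
    for col in sorted(columns):
        nulls = sum(1 for r in rows if r.get(col) is None)
        if nulls:
            issues.append(f"nulls: {col} has {nulls} missing value(s)")

    # Duplicates: flag repeated primary-key values.
    keys = [r.get(key) for r in rows]
    dupes = len(keys) - len(set(keys))
    if dupes:
        issues.append(f"duplicates: {dupes} repeated {key} value(s)")

    return issues
```

In practice a check like this would run against each published share (Snowflake, Delta Sharing, Analytics Hub, S3) and feed the flagged issues into whatever alerting the engineering team uses.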
Requirements:
• 1–2 years of experience in product, data, analytics, or a related field
• Exposure to data platforms (e.g., Snowflake, Databricks, BigQuery)
• Basic working knowledge of SQL or core data concepts (e.g., tables, schemas, joins)
• Familiarity with data quality, governance, or security concepts
• Experience working in agile environments with exposure to backlog management
• Strong written communication skills, including user stories and documentation
• Ability to collaborate across engineering, data, legal, security, and go-to-market (GTM) teams
Benefits:
• Equal opportunity workplace
• Commitment to diversity of thought and unique perspectives