Your answer provides context for your results — the questions and risk areas are the same regardless of what you select.
Your audience determines which risk areas apply — consumer products face the widest set of obligations.
For example: your users' children, your healthcare customers' patients, or people whose content you process but who never interact with you directly.
Products often affect people who never signed up — those people have the same legal rights as your direct users.
Whether people use the product directly or through a staff member affects which online safety obligations apply.
Select every way your product handles personal data — even basic analytics counts in most jurisdictions.
AI includes any automated decision logic or third-party model you integrate, not just obvious features like chatbots.
Online safety obligations apply whenever users can post content, message each other, or leave reviews.
Even if none of your data is tagged as children's data, minors may still use your product, and design obligations follow them.
Jurisdiction doesn't change which risk areas apply — it determines which specific laws govern your obligations.