Receive Duplicate Activity Warning Before Submission
When a user submits an activity record, the system runs a server-side similarity check against existing activities for the same peer mentor within a configurable time window (default: same day). The check uses indexed primary signals — user_id, activity_date, and activity_type_id — and fuzzy secondary signals including duration within a 15-minute tolerance and contact_id match. If the resulting confidence score exceeds the detection threshold, the Duplicate Warning Dialog is presented before the activity is saved. The dialog displays the candidate record alongside the conflicting existing record's key details (date, duration, contact, activity type) so the user can compare both entries side by side and decide whether to proceed or cancel.
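The scoring described above (exact primary signals, fuzzy secondary signals) can be sketched as follows. This is a minimal illustration, not the actual implementation: the field names mirror the columns named in this spec, but the weights and the `0.7` threshold are assumed placeholder values for the configurable detection threshold.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Activity:
    user_id: int
    activity_date: str        # ISO date string, e.g. "2024-05-01"
    activity_type_id: int
    duration_minutes: int
    contact_id: Optional[int]

DURATION_TOLERANCE_MIN = 15   # fuzzy secondary signal from the spec
CONFIDENCE_THRESHOLD = 0.7    # assumed value; configurable in practice

def duplicate_confidence(candidate: Activity, existing: Activity) -> float:
    """Score how likely `existing` is a duplicate of `candidate`.

    Primary signals (user_id, activity_date, activity_type_id) must all
    match exactly; each fuzzy secondary signal adds weight on top.
    """
    primary_match = (
        candidate.user_id == existing.user_id
        and candidate.activity_date == existing.activity_date
        and candidate.activity_type_id == existing.activity_type_id
    )
    if not primary_match:
        return 0.0

    score = 0.6  # primary match alone is already a plausible duplicate
    if abs(candidate.duration_minutes - existing.duration_minutes) <= DURATION_TOLERANCE_MIN:
        score += 0.2
    if candidate.contact_id is not None and candidate.contact_id == existing.contact_id:
        score += 0.2
    return score

def should_warn(candidate: Activity, existing: Activity) -> bool:
    """True if the Duplicate Warning Dialog should be shown."""
    return duplicate_confidence(candidate, existing) > CONFIDENCE_THRESHOLD
```

With this weighting, a primary-only match (0.6) stays below the threshold, while a primary match plus either secondary signal (0.8) triggers the warning, matching the high-confidence case described in the acceptance criteria.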
User Story
As a coordinator submitting an activity record, I want to be warned when my entry looks like a duplicate of an existing activity, so that the same activity is not registered twice for a peer mentor.
Acceptance Criteria
- Given a user is submitting an activity, when the system finds an existing activity for the same peer mentor on the same day with the same activity type, then the Duplicate Warning Dialog is shown before saving
- Given the duplicate check runs, when primary signals (user_id, activity_date, activity_type_id) match and secondary signals (duration within 15 min, contact_id) also match, then a high-confidence duplicate warning is displayed
- Given the Duplicate Warning Dialog is shown, when it renders, then it displays the conflicting existing record's date, duration, contact name, and activity type alongside the candidate record's details
- Given the duplicate check runs, when no match exceeds the confidence threshold, then the activity is saved without any warning dialog appearing
- Given the duplicate check is in progress, when the check completes, then the submission flow is not blocked for more than 2 seconds under normal conditions
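The 2-second criterion above implies the submission flow needs a time cap on the server-side check. One way to sketch that is a bounded wait around the check, falling back to "no warning" if it runs long; this is an illustrative pattern, not the mandated design, and `run_check` stands in for whatever function performs the actual similarity query.

```python
import concurrent.futures

CHECK_TIMEOUT_SECONDS = 2.0  # cap from the acceptance criteria

def check_with_timeout(run_check, timeout=CHECK_TIMEOUT_SECONDS):
    """Run the duplicate check, but never block submission past `timeout`.

    Returns the check's result, or None if it exceeded the time cap
    (a slow check is treated as "no duplicate found" so saving proceeds).
    """
    pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
    future = pool.submit(run_check)
    try:
        return future.result(timeout=timeout)
    except concurrent.futures.TimeoutError:
        return None  # slow check: proceed without a warning dialog
    finally:
        pool.shutdown(wait=False)
```

Whether a timed-out check should fail open (save without warning, as here) or queue a follow-up review is a product decision this spec would need to pin down.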
Business Value
NHF explicitly identified duplicate activity registration as a compliance and financial risk: the same activity being recorded by multiple coordinators inflates Bufdir statistics, which can lead to grant clawback, audit findings, or reputational damage. Proactive detection at the point of entry prevents data quality problems before they compound into costly corrections in regulatory reporting.