Automated Test: ecosystem-sync-integration-after #326
Conversation
📝 Walkthrough
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~22 minutes
🚥 Pre-merge checks: ✅ 1 passed | ❌ 1 failed (1 warning)
Actionable comments posted: 5
🤖 Fix all issues with AI agents
In `@src/sentry/integrations/mixins/issues.py`:
- Around line 410-418: The example integration's sync_status_outbound
implementation doesn't match the abstract signature; modify the
sync_status_outbound method in the example integration to accept the full
signature (external_issue, is_resolved, project_id, assignment_source:
AssignmentSource | None = None, **kwargs), and update its body to handle or
safely ignore the assignment_source and any extra kwargs (e.g., preserve
existing behavior but accept the new parameters) so it conforms to the abstract
method contract.
In `@src/sentry/integrations/services/assignment_source.py`:
- Line 18: The queued attribute on AssignmentSource is using timezone.now() at
import time, so change its default to use a factory instead: update the
AssignmentSource definition to replace the evaluated default (queued: datetime =
timezone.now()) with a dataclass default_factory (e.g., using
dataclasses.field(default_factory=timezone.now)) and add the import for field;
this ensures queued is set per-instance at creation time rather than once at
module load.
In `@src/sentry/integrations/tasks/sync_assignee_outbound.py`:
- Around line 53-61: AssignmentSource.from_dict can return None on
deserialization errors which causes parsed_assignment_source to be None and
bypass should_sync’s cycle-prevention; wrap the deserialization in a try/except
and ensure that when assignment_source_dict exists but parsing fails you create
and pass a non-None fallback AssignmentSource (e.g., a minimal instance with
source or raw data preserved and datetime=None) instead of leaving
parsed_assignment_source as None so installation.should_sync(...) and
installation.sync_assignee_outbound(...) still receive a non-None assignment
source and cycle-prevention runs; reference AssignmentSource.from_dict,
parsed_assignment_source, should_sync, and installation.sync_assignee_outbound
when making the change.
In `@tests/sentry/integrations/services/test_assignment_source.py`:
- Around line 29-38: Update the test_to_dict to perform a round-trip
serialization: call source.to_dict(), then pass that dict into
AssignmentSource.from_dict(...) and assert the resulting AssignmentSource equals
the original (or at least matches key fields like source_name, integration_id,
queued timestamps). This will exercise datetime serialization/deserialization
paths in AssignmentSource.to_dict and AssignmentSource.from_dict and catch the
datetime issue; use the class methods AssignmentSource.to_dict and
AssignmentSource.from_dict and compare relevant attributes rather than relying
solely on dict presence checks.
- Line 13: Rename the mistyped test function test_from_dict_inalid_data to
test_from_dict_invalid_data; update the function definition and any references
to the old name (e.g., in tests or test markers) so the test name is correct and
consistent with expected naming for the assignment source tests.
src/sentry/integrations/mixins/issues.py

```diff
 @abstractmethod
-def sync_status_outbound(self, external_issue, is_resolved, project_id, **kwargs):
+def sync_status_outbound(
+    self,
+    external_issue,
+    is_resolved,
+    project_id,
+    assignment_source: AssignmentSource | None = None,
+    **kwargs,
+):
```
🧩 Analysis chain
🏁 Script executed:

```bash
#!/bin/bash
# Find all concrete implementations of sync_status_outbound
rg -n "def sync_status_outbound" --type=py -A 8
```

Repository: admin-coderabbit/coderabbit-sentry-mirror
Length of output: 5045

🏁 Script executed:

```bash
#!/bin/bash
# More thorough search for all sync_status_outbound implementations
rg -n "sync_status_outbound" --type=py | grep "def " | head -20
```

Repository: admin-coderabbit/coderabbit-sentry-mirror
Length of output: 1581

🏁 Script executed:

```bash
#!/bin/bash
# Find where sync_status_outbound is called with assignment_source
rg -n "\.sync_status_outbound\(" --type=py -B 2 -A 5 | head -80
```

Repository: admin-coderabbit/coderabbit-sentry-mirror
Length of output: 4424

🏁 Script executed:

```bash
#!/bin/bash
# Search for calls that pass assignment_source parameter
rg -n "assignment_source" --type=py | grep -E "(sync_status_outbound|call)" | head -20
```

Repository: admin-coderabbit/coderabbit-sentry-mirror
Length of output: 67

🏁 Script executed:

```bash
#!/bin/bash
# Search for assignment_source usage broadly
rg -n "assignment_source" --type=py -B 3 -A 3
```

Repository: admin-coderabbit/coderabbit-sentry-mirror
Length of output: 13137

🏁 Script executed:

```bash
#!/bin/bash
# Check the example integration implementation and its parent class
cat -n src/sentry/integrations/example/integration.py | sed -n '150,175p'
```

Repository: admin-coderabbit/coderabbit-sentry-mirror
Length of output: 1078
Example integration's sync_status_outbound method is missing the assignment_source parameter.
The sync_status_outbound implementation in src/sentry/integrations/example/integration.py:161 has signature def sync_status_outbound(self, external_issue, is_resolved, project_id): which doesn't accept the assignment_source parameter or **kwargs added to the abstract method. This violates the abstract contract and could cause runtime failures if assignment_source is passed. Update it to match the abstract signature.
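As a rough sketch only (not the repository's actual code), the conforming override on the example integration could look like the following; the class stub and the `...` body are placeholders for the existing implementation, whose behavior should be preserved:

```python
from __future__ import annotations

from sentry.integrations.services.assignment_source import AssignmentSource


class ExampleIntegration:  # stub; in the codebase this extends the integration base classes
    def sync_status_outbound(
        self,
        external_issue,
        is_resolved,
        project_id,
        assignment_source: AssignmentSource | None = None,
        **kwargs,
    ):
        # Accept the widened signature required by the abstract method and
        # safely ignore assignment_source and any extra kwargs, keeping the
        # integration's existing status-sync behavior unchanged.
        ...
```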
src/sentry/integrations/services/assignment_source.py

```python
class AssignmentSource:
    source_name: str
    integration_id: int
    queued: datetime = timezone.now()
```
Bug: timezone.now() is evaluated once at class definition time, not per-instance.
timezone.now() is called when the module is imported, and that single value becomes the default for every AssignmentSource instance. All instances created without an explicit queued argument will share the same stale timestamp.
🐛 Proposed fix: use `default_factory`

```diff
-from dataclasses import asdict, dataclass
+from dataclasses import asdict, dataclass, field
 from datetime import datetime
 from typing import TYPE_CHECKING, Any
 from django.utils import timezone
 ...

 @dataclass(frozen=True)
 class AssignmentSource:
     source_name: str
     integration_id: int
-    queued: datetime = timezone.now()
+    queued: datetime = field(default_factory=timezone.now)
```

🧰 Tools
🪛 Ruff (0.14.14)
[warning] 18-18: Do not perform function call timezone.now in dataclass defaults
(RUF009)
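For illustration only, a self-contained sketch (using a local `utcnow` helper as a stand-in for `django.utils.timezone.now`) of why an evaluated default is shared by every instance while a `default_factory` is not:

```python
import time
from dataclasses import dataclass, field
from datetime import datetime, timezone


def utcnow() -> datetime:
    # Stand-in for django.utils.timezone.now in this standalone demo.
    return datetime.now(timezone.utc)


@dataclass(frozen=True)
class EvaluatedDefault:
    queued: datetime = utcnow()  # computed once, at class definition time


@dataclass(frozen=True)
class FactoryDefault:
    queued: datetime = field(default_factory=utcnow)  # computed per instance


a = EvaluatedDefault()
time.sleep(0.01)
b = EvaluatedDefault()
assert a.queued == b.queued  # both share the single stale default

c = FactoryDefault()
time.sleep(0.01)
d = FactoryDefault()
assert c.queued != d.queued  # each instance gets a fresh timestamp
```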
src/sentry/integrations/tasks/sync_assignee_outbound.py

```diff
 parsed_assignment_source = (
     AssignmentSource.from_dict(assignment_source_dict) if assignment_source_dict else None
 )
 if installation.should_sync("outbound_assignee", parsed_assignment_source):
     # Assume unassign if None.
     user = user_service.get_user(user_id) if user_id else None
-    installation.sync_assignee_outbound(external_issue, user, assign=assign)
+    installation.sync_assignee_outbound(
+        external_issue, user, assign=assign, assignment_source=parsed_assignment_source
+    )
```
Deserialization failure silently disables cycle prevention.
If AssignmentSource.from_dict returns None (e.g., due to the datetime field not deserializing correctly from JSON), parsed_assignment_source will be None. This means should_sync will skip the cycle-prevention guard, allowing the very sync loops this feature is designed to prevent.
This is a downstream consequence of the serialization issue in to_dict/from_dict. Once the datetime round-trip is fixed in AssignmentSource, this path will work correctly.
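One possible shape for the defensive parsing, sketched here as a hypothetical `parse_assignment_source` helper; the helper name and the fallback field values are illustrative, not part of the PR:

```python
from __future__ import annotations

from sentry.integrations.services.assignment_source import AssignmentSource


def parse_assignment_source(assignment_source_dict: dict | None) -> AssignmentSource | None:
    # Hypothetical helper: never lose cycle-prevention just because parsing failed.
    if not assignment_source_dict:
        return None
    try:
        parsed = AssignmentSource.from_dict(assignment_source_dict)
    except Exception:
        parsed = None
    if parsed is None:
        # Fall back to a minimal AssignmentSource so should_sync still receives
        # a non-None value and can run its cycle check.
        parsed = AssignmentSource(
            source_name=str(assignment_source_dict.get("source_name", "unknown")),
            integration_id=int(assignment_source_dict.get("integration_id", 0)),
        )
    return parsed
```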
tests/sentry/integrations/services/test_assignment_source.py

```python
        result = AssignmentSource.from_dict(data)
        assert result is None

    def test_from_dict_inalid_data(self):
```
Typo in test name: inalid → invalid.
Proposed fix

```diff
-    def test_from_dict_inalid_data(self):
+    def test_from_dict_invalid_data(self):
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

```python
    def test_from_dict_invalid_data(self):
```
tests/sentry/integrations/services/test_assignment_source.py

```python
    def test_to_dict(self):
        source = AssignmentSource(
            source_name="foo-source",
            integration_id=123,
        )

        result = source.to_dict()
        assert result.get("queued") is not None
        assert result.get("source_name") == "foo-source"
        assert result.get("integration_id") == 123
```
🛠️ Refactor suggestion | 🟠 Major
Add a round-trip test (to_dict → from_dict) to catch serialization issues.
A round-trip test would validate that from_dict(source.to_dict()) produces an equivalent AssignmentSource. This is the actual usage pattern (serialize for Celery, deserialize in the task) and would catch the datetime serialization issue flagged in the source module.
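A sketch of such a round-trip test, following the style of the test above (the test name is illustrative, and the equality check on `queued` may need to tolerate the serialization format once the datetime handling is fixed):

```python
    def test_round_trip(self):
        source = AssignmentSource(
            source_name="foo-source",
            integration_id=123,
        )

        restored = AssignmentSource.from_dict(source.to_dict())

        assert restored is not None
        assert restored.source_name == source.source_name
        assert restored.integration_id == source.integration_id
        assert restored.queued == source.queued
```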
This pull request was automatically created by @coderabbitai/e2e-reviewer. Batch created pull request.
Summary by CodeRabbit
- New Features
- Bug Fixes