Compare commits
23 Commits
Commits in this comparison: 8b3c784e54, 0cd1b9c84d, 9d08b7fcfe, 5f6f988118, 1897d48188, 0207c175ba, 29e42b8d80, 448a0705d2, 7ffeae1476, 7ffd4b9a91, f49e15c05c, bbe3b3a493, 8a0457aeec, 10845d2e5f, dccd7bb300, 3ec2ff1f89, 8afeda1df7, 26f589751d, 81f551a21d, 88c9516308, 402489c928, f20f3c960d, fb951acb72
.cursor/rules/fastapi.mdc (new file, 32 lines)
@@ -0,0 +1,32 @@
---
description:
globs:
alwaysApply: true
---
# FastAPI-Specific Guidelines:
- Use functional components (plain functions) and Pydantic models for input validation and response schemas.
- Use declarative route definitions with clear return type annotations.
- Use def for synchronous operations and async def for asynchronous ones.
- Minimize @app.on_event("startup") and @app.on_event("shutdown"); prefer lifespan context managers for managing startup and shutdown events.
- Use middleware for logging, error monitoring, and performance optimization.
- Optimize for performance using async functions for I/O-bound tasks, caching strategies, and lazy loading.
- Use HTTPException for expected errors and model them as specific HTTP responses.
- Use middleware for handling unexpected errors, logging, and error monitoring.
- Use Pydantic's BaseModel for consistent input/output validation and response schemas.

Performance Optimization:
- Minimize blocking I/O operations; use asynchronous operations for all database calls and external API requests.
- Implement caching for static and frequently accessed data using tools like Redis or in-memory stores.
- Optimize data serialization and deserialization with Pydantic.
- Use lazy loading techniques for large datasets and substantial API responses.

Key Conventions:
1. Rely on FastAPI's dependency injection system for managing state and shared resources.
2. Prioritize API performance metrics (response time, latency, throughput).
3. Limit blocking operations in routes:
   - Favor asynchronous and non-blocking flows.
   - Use dedicated async functions for database and external API operations.
   - Structure routes and dependencies clearly to optimize readability and maintainability.

Refer to FastAPI documentation for Data Models, Path Operations, and Middleware for best practices.
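The lifespan guideline above can be sketched as a plain async context manager, which is the shape FastAPI's `lifespan` parameter expects. The sketch below is framework-free so the pattern itself is visible; `serve` and the `events` list are illustrative only, not part of any real API.

```python
import asyncio
from contextlib import asynccontextmanager

events: list[str] = []

@asynccontextmanager
async def lifespan(app):
    # Startup work (open DB pools, warm caches) goes before the yield.
    events.append("startup")
    yield
    # Shutdown work (close pools, flush logs) goes after the yield.
    events.append("shutdown")

async def serve() -> None:
    # FastAPI drives this context manager for you when the app is
    # constructed as FastAPI(lifespan=lifespan).
    async with lifespan(app=None):
        events.append("handling requests")

asyncio.run(serve())
```

The same function can be passed directly to `FastAPI(lifespan=...)`, replacing separate startup/shutdown event handlers.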
.cursor/rules/roadmap.mdc (new file, 267 lines)
@@ -0,0 +1,267 @@
---
description:
globs:
alwaysApply: false
---
Based on a thorough review of your project's structure and code, here is a detailed, LLM-friendly task list to implement the requested features.

This plan is designed to be sequential and modular, focusing on backend database changes first, then backend logic and APIs, and finally the corresponding frontend implementation for each feature.

---

### **High-Level Strategy & Recommendations**

1. **Iterative Implementation:** Tackle one major feature at a time (e.g., complete Audit Logging, then Archiving, etc.). This keeps pull requests manageable and easier to review.
2. **Traceability:** The request for traceability is key. We will use timestamp-based flags (`archived_at`, `deleted_at`) instead of booleans and create dedicated history/log tables for critical actions.

---

### **Phase 1: Database Schema Redesign**

This is the most critical first step. All subsequent tasks depend on these changes. You will need to create a new Alembic migration to apply them.

**File to Modify:** `be/app/models.py`
**Action:** Create a new Alembic migration file (`alembic revision -m "feature_updates_phase1"`) and implement the following changes in `upgrade()`.
**1. Financial Audit Logging**
* Create a new table to log every financial transaction and change. This ensures complete traceability.

```python
# In be/app/models.py
class FinancialAuditLog(Base):
    __tablename__ = 'financial_audit_log'
    id = Column(Integer, primary_key=True, index=True)
    timestamp = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
    user_id = Column(Integer, ForeignKey('users.id'), nullable=True)  # User who performed the action. Nullable for system actions.
    action_type = Column(String, nullable=False, index=True)  # e.g., 'EXPENSE_CREATED', 'SPLIT_PAID', 'SETTLEMENT_DELETED'
    entity_type = Column(String, nullable=False)  # e.g., 'Expense', 'ExpenseSplit', 'Settlement'
    entity_id = Column(Integer, nullable=False)
    details = Column(JSONB, nullable=True)  # To store 'before' and 'after' states or other relevant data.

    user = relationship("User")
```
**2. Archiving Lists and History**
* Modify the `lists` table to support soft deletion/archiving.

```python
# In be/app/models.py, class List(Base):
# REMOVE: is_deleted = Column(Boolean, default=False, nullable=False)  # If it exists
archived_at = Column(DateTime(timezone=True), nullable=True, index=True)
```
**3. Chore Subtasks**
* Add a self-referencing foreign key to the `chores` table.

```python
# In be/app/models.py, class Chore(Base):
parent_chore_id = Column(Integer, ForeignKey('chores.id'), nullable=True, index=True)

# Add relationships
parent_chore = relationship("Chore", remote_side=[id], back_populates="child_chores")
child_chores = relationship("Chore", back_populates="parent_chore", cascade="all, delete-orphan")
```
**4. List Categories**
* Create a new `categories` table and link it to the `items` table. This allows items to be categorized.

```python
# In be/app/models.py
class Category(Base):
    __tablename__ = 'categories'
    id = Column(Integer, primary_key=True, index=True)
    name = Column(String, nullable=False, index=True)
    user_id = Column(Integer, ForeignKey('users.id'), nullable=True)  # Nullable for global categories
    group_id = Column(Integer, ForeignKey('groups.id'), nullable=True)  # Nullable for user-specific or global
    # Add constraints to ensure either user_id or group_id is set, or both are null for global categories
    __table_args__ = (UniqueConstraint('name', 'user_id', 'group_id', name='uq_category_scope'),)

# In be/app/models.py, class Item(Base):
category_id = Column(Integer, ForeignKey('categories.id'), nullable=True)
category = relationship("Category")
```
**5. Time Tracking for Chores**
* Create a new `time_entries` table to log time spent on chore assignments.

```python
# In be/app/models.py
class TimeEntry(Base):
    __tablename__ = 'time_entries'
    id = Column(Integer, primary_key=True, index=True)
    chore_assignment_id = Column(Integer, ForeignKey('chore_assignments.id', ondelete="CASCADE"), nullable=False)
    user_id = Column(Integer, ForeignKey('users.id'), nullable=False)
    start_time = Column(DateTime(timezone=True), nullable=False)
    end_time = Column(DateTime(timezone=True), nullable=True)
    duration_seconds = Column(Integer, nullable=True)  # Calculated when end_time is set

    assignment = relationship("ChoreAssignment")
    user = relationship("User")
```
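The `duration_seconds` calculation mentioned in the model comment can be sketched as a small pure function; `stop_time_entry` is a hypothetical helper name, not from the codebase.

```python
from datetime import datetime, timedelta, timezone

def stop_time_entry(start_time: datetime, end_time: datetime) -> int:
    """Return duration_seconds for a time entry whose end_time was just set."""
    if end_time < start_time:
        raise ValueError("end_time precedes start_time")
    return int((end_time - start_time).total_seconds())

# 25 minutes 30 seconds of tracked work
start = datetime(2025, 6, 9, 18, 0, tzinfo=timezone.utc)
end = start + timedelta(minutes=25, seconds=30)
duration = stop_time_entry(start, end)
```

Storing the computed integer alongside the raw timestamps keeps per-chore totals a cheap `SUM` instead of a datetime subtraction in every query.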
---

### **Phase 2: Backend Implementation**

For each feature, implement the necessary backend logic.

#### **Task 2.1: Implement Financial Audit Logging**

* **Goal:** Automatically log all changes to expenses, splits, and settlements.
* **Tasks:**
  1. **CRUD (`be/app/crud/audit.py`):**
     * Create a new file `audit.py`.
     * Implement `create_financial_audit_log(db: AsyncSession, user_id: int, action_type: str, entity: Base, details: dict)`. This function will create a new log entry.
  2. **Integrate Logging:**
     * Modify `be/app/crud/expense.py`: In `create_expense`, `update_expense`, and `delete_expense`, call `create_financial_audit_log`. For updates, the `details` JSONB should contain `{"before": {...}, "after": {...}}`.
     * Modify `be/app/crud/settlement.py`: Do the same for `create_settlement`, `update_settlement`, and `delete_settlement`.
     * Modify `be/app/crud/settlement_activity.py`: Do the same for `create_settlement_activity`.
  3. **API (`be/app/api/v1/endpoints/history.py`, new file):**
     * Create a new endpoint `GET /history/financial/group/{group_id}` to view the audit log for a group.
     * Create a new endpoint `GET /history/financial/user/me` for a user's personal financial history.
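The `{"before": {...}, "after": {...}}` payload for update logging can be kept small by recording only fields that changed. A minimal sketch, with `build_audit_details` as a hypothetical helper:

```python
def build_audit_details(before: dict, after: dict) -> dict:
    """Build a JSONB-ready details payload containing only changed fields."""
    changed = {k for k in before.keys() | after.keys() if before.get(k) != after.get(k)}
    return {
        "before": {k: before.get(k) for k in sorted(changed)},
        "after": {k: after.get(k) for k in sorted(changed)},
    }

# Example: only the amount changed on an expense update.
details = build_audit_details(
    {"description": "Groceries", "total_amount": "42.50"},
    {"description": "Groceries", "total_amount": "45.00"},
)
```

The resulting dict is what `create_financial_audit_log` would store in the `details` column for an `EXPENSE_UPDATED` entry.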
#### **Task 2.2: Implement Archiving**

* **Goal:** Allow users to archive lists instead of permanently deleting them.
* **Tasks:**
  1. **CRUD (`be/app/crud/list.py`):**
     * Rename `delete_list` to `archive_list`. Instead of `db.delete(list_db)`, it should set `list_db.archived_at = datetime.now(timezone.utc)`.
     * Modify `get_lists_for_user` to filter out archived lists by default: `.where(ListModel.archived_at.is_(None))`.
  2. **API (`be/app/api/v1/endpoints/lists.py`):**
     * Update the `DELETE /{list_id}` endpoint to call `archive_list`.
     * Create a new endpoint `GET /archived` to fetch archived lists for the user.
     * Create a new endpoint `POST /{list_id}/unarchive` to set `archived_at` back to `NULL`.
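The archive/unarchive transitions boil down to toggling the timestamp flag. A minimal in-memory sketch of the intended semantics (the dataclass and function names are illustrative, not the actual ORM code):

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ShoppingList:
    name: str
    archived_at: Optional[datetime] = None  # timestamp flag instead of a boolean

def archive_list(lst: ShoppingList) -> None:
    lst.archived_at = datetime.now(timezone.utc)

def unarchive_list(lst: ShoppingList) -> None:
    lst.archived_at = None

def visible_lists(lists: list[ShoppingList]) -> list[ShoppingList]:
    # Mirrors the SQL filter .where(ListModel.archived_at.is_(None))
    return [l for l in lists if l.archived_at is None]

groceries = ShoppingList("Groceries")
hardware = ShoppingList("Hardware")
archive_list(hardware)
names = [l.name for l in visible_lists([groceries, hardware])]
```

Using a timestamp rather than a boolean preserves *when* the list was archived, which the traceability goal above calls for.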
#### **Task 2.3: Implement Chore Subtasks & Unmarking Completion**

* **Goal:** Allow chores to have a hierarchy and for completion to be reversible.
* **Tasks:**
  1. **Schemas (`be/app/schemas/chore.py`):**
     * Update the `ChorePublic` and `ChoreCreate` schemas to include `parent_chore_id: Optional[int]` and `child_chores: List[ChorePublic] = []`.
  2. **CRUD (`be/app/crud/chore.py`):**
     * Modify `create_chore` and `update_chore` to handle the `parent_chore_id`.
     * In `update_chore_assignment`, enhance the `is_complete=False` logic. When a chore is re-opened, log it to the history. Decide on the policy for the parent chore's `next_due_date` (recommendation: do not automatically roll it back; let the user adjust it manually if needed).
  3. **API (`be/app/api/v1/endpoints/chores.py`):**
     * Update the `POST` and `PUT` endpoints for chores to accept `parent_chore_id`.
     * The `PUT /assignments/{assignment_id}` endpoint already supports setting `is_complete`. Ensure it correctly calls the updated CRUD logic.
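Assembling the nested `child_chores` shape from flat rows keyed by `parent_chore_id` can be done in one pass. A sketch over plain dicts standing in for the ORM rows (`build_chore_tree` is a hypothetical helper):

```python
from typing import Optional

def build_chore_tree(rows: list[dict]) -> list[dict]:
    """Group flat chore rows into root chores with nested child_chores lists."""
    by_id = {row["id"]: {**row, "child_chores": []} for row in rows}
    roots: list[dict] = []
    for node in by_id.values():
        parent_id: Optional[int] = node.get("parent_chore_id")
        if parent_id is None:
            roots.append(node)  # top-level chore
        else:
            by_id[parent_id]["child_chores"].append(node)  # subtask
    return roots

rows = [
    {"id": 1, "name": "Spring cleaning", "parent_chore_id": None},
    {"id": 2, "name": "Wash windows", "parent_chore_id": 1},
    {"id": 3, "name": "Vacuum", "parent_chore_id": 1},
]
tree = build_chore_tree(rows)
```

The same shape serializes directly into the recursive `ChorePublic.child_chores` field.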
#### **Task 2.4: Implement List Categories**

* **Goal:** Allow items to be categorized for better organization.
* **Tasks:**
  1. **Schemas (`be/app/schemas/category.py`, new file):**
     * Create `CategoryCreate`, `CategoryUpdate`, and `CategoryPublic`.
  2. **CRUD (`be/app/crud/category.py`, new file):**
     * Implement full CRUD functions for categories (`create_category`, `get_user_categories`, `update_category`, `delete_category`).
  3. **API (`be/app/api/v1/endpoints/categories.py`, new file):**
     * Create endpoints for `GET /`, `POST /`, `PUT /{id}`, and `DELETE /{id}` for categories.
  4. **Item Integration:**
     * Update the `ItemCreate` and `ItemUpdate` schemas in `be/app/schemas/item.py` to include `category_id: Optional[int]`.
     * Update `crud_item.create_item` and `crud_item.update_item` to handle setting the `category_id`.
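Given the nullable `user_id`/`group_id` design from Phase 1, `get_user_categories` has to merge three scopes: global, user-owned, and group-owned. A sketch of that visibility rule over plain dicts (`category_visible` is a hypothetical helper, not from the codebase):

```python
def category_visible(cat: dict, user_id: int, group_ids: set[int]) -> bool:
    """Global (no owner), owned by this user, or owned by one of the user's groups."""
    if cat["user_id"] is None and cat["group_id"] is None:
        return True  # global category
    if cat["user_id"] == user_id:
        return True  # personal category
    return cat["group_id"] in group_ids  # shared group category

cats = [
    {"name": "Produce", "user_id": None, "group_id": None},
    {"name": "Private", "user_id": 7, "group_id": None},
    {"name": "Household", "user_id": None, "group_id": 3},
]
visible = [c["name"] for c in cats if category_visible(c, user_id=1, group_ids={3})]
```

In SQL this becomes a single `OR` of the three conditions in the `get_user_categories` query.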
#### **Task 2.5: Enhance OCR for Receipts**

* **skipped**
#### **Task 2.6: Implement "Continue as Guest"**

* **Goal:** Allow users to use the app without creating a full account.
* **Tasks:**
  1. **DB Model (`be/app/models.py`):**
     * Add `is_guest = Column(Boolean, default=False, nullable=False)` to the `User` model.
  2. **Auth (`be/app/api/auth/guest.py`, new file):**
     * Create a new router for guest functionality.
     * Implement a `POST /auth/guest` endpoint. This endpoint will:
       * Create a new user with a unique but temporary-looking email (e.g., `guest_{uuid}@guest.mitlist.app`).
       * Set `is_guest=True`.
       * Generate and return JWT tokens for this guest user, just like a normal login.
  3. **Claim Account (`be/app/api/auth/guest.py`):**
     * Implement a `POST /auth/guest/claim` endpoint (requires auth). This endpoint will take a new email and password, set `is_guest=False`, update the credentials, and mark the email for verification.
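The guest-creation and claim steps above can be sketched as plain functions. The email format is the one from the plan; `make_guest_email` and `claim_account` are hypothetical helper names, and the user is a dict standing in for the ORM model.

```python
import re
import uuid

def make_guest_email() -> str:
    # Matches the plan's guest_{uuid}@guest.mitlist.app format.
    return f"guest_{uuid.uuid4()}@guest.mitlist.app"

def claim_account(user: dict, new_email: str, hashed_password: str) -> dict:
    """Upgrade a guest user in place: real credentials, email pending verification."""
    assert user["is_guest"], "only guest accounts can be claimed"
    user.update(is_guest=False, email=new_email,
                hashed_password=hashed_password, email_verified=False)
    return user

guest = {"email": make_guest_email(), "is_guest": True}
is_guest_email = bool(re.fullmatch(r"guest_[0-9a-f-]{36}@guest\.mitlist\.app", guest["email"]))
claimed = claim_account(guest, "alice@example.com", "<bcrypt-hash>")
```

Claiming mutates the existing row rather than creating a new user, so all lists, chores, and expenses created while a guest stay attached to the same account.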
#### **Task 2.7: Implement Redis**

* **Goal:** Integrate Redis for caching to improve performance.
* **Tasks:**
  1. **Dependencies (`be/requirements.txt`):** Add `redis`.
  2. **Configuration (`be/app/config.py`):** Add `REDIS_URL` to settings.
  3. **Connection (`be/app/core/redis.py`, new file):** Create a Redis connection pool.
  4. **Caching (`be/app/core/cache.py`, new file):** Implement a simple caching decorator.

```python
# Example decorator skeleton
def cache(expire_time: int = 3600):
    def decorator(func):
        @wraps(func)
        async def wrapper(*args, **kwargs):
            # ... check the cache for this call; return the cached value on a hit ...
            # ... on a miss, call func and store the result with expire_time ...
            return result
        return wrapper
    return decorator
```

  5. **Apply Caching:** Apply the `@cache` decorator to read-heavy, non-volatile CRUD functions like `crud_group.get_group_by_id`.
---

### **Phase 3: Frontend Implementation**

Implement the UI for the new features, using your Valerie UI components.

#### **Task 3.1: Implement Archiving UI**

* **Goal:** Allow users to archive and view archived lists.
* **Files to Modify:** `fe/src/pages/ListsPage.vue`, `fe/src/stores/listStore.ts` (if you create one).
* **Tasks:**
  1. Change the "Delete" action on lists to "Archive".
  2. Add a toggle/filter to show archived lists.
  3. When viewing archived lists, show an "Unarchive" button.
#### **Task 3.2: Implement Subtasks and Unmarking UI**

* **Goal:** Update the chore interface for subtasks and undoing completion.
* **Files to Modify:** `fe/src/pages/ChoresPage.vue`, `fe/src/components/ChoreItem.vue` (if it exists).
* **Tasks:**
  1. Modify the chore list to be a nested/tree view to display parent-child relationships.
  2. Update the chore creation/edit modal to include a "Parent Chore" dropdown.
  3. On completed chores, change the "Completed" checkmark to an "Undo" button. Clicking it should call the API to set `is_complete` to `false`.
#### **Task 3.3: Implement Category Management and Supermarkt Mode**

* **Goal:** Add category features and the special "Supermarkt Mode".
* **Files to Modify:** `fe/src/pages/ListDetailPage.vue`, `fe/src/components/Item.vue`.
* **Tasks:**
  1. Create a new page/modal for managing categories (CRUD).
  2. In the `ListDetailPage`, add a "Category" dropdown when adding/editing an item.
  3. Display items grouped by category.
  4. **Supermarkt Mode:**
     * Add a toggle button on the `ListDetailPage` to enter "Supermarkt Mode".
     * When an item is checked, apply a temporary CSS class to other items in the same category.
     * Ensure the price input field appears next to checked items.
     * Add a `VProgressBar` at the top, with `value` bound to `completedItems.length` and `max` bound to `totalItems.length`.
#### **Task 3.4: Implement Time Tracking UI**

* **Goal:** Allow users to track time on chores.
* **Files to Modify:** `fe/src/pages/ChoresPage.vue`.
* **Tasks:**
  1. Add a "Start/Stop" timer button on each chore assignment.
  2. Clicking "Start" sends a `POST /time_entries` request.
  3. Clicking "Stop" sends a `PUT /time_entries/{id}` request.
  4. Display the total time spent on the chore.
#### **Task 3.5: Implement Guest Flow**

* **Goal:** Provide a seamless entry point for new users.
* **Files to Modify:** `fe/src/pages/LoginPage.vue`, `fe/src/stores/auth.ts`, `fe/src/router/index.ts`.
* **Tasks:**
  1. On the `LoginPage`, add a "Continue as Guest" button.
  2. This button calls a new `authStore.loginAsGuest()` action.
  3. The action hits the `POST /auth/guest` endpoint, receives tokens, and stores them.
  4. The router logic needs adjustment to handle guest users. You might want to protect certain pages (like "Account Settings") even from guests.
  5. Add a persistent banner in the UI for guest users: "You are using a guest account. **Sign up** to save your data."
**Final Note:** This is a comprehensive roadmap. Each major task can be broken down further into smaller sub-tasks. Good luck with the implementation!
.cursor/rules/vue.mdc (new file, 37 lines)
@@ -0,0 +1,37 @@
---
description:
globs:
alwaysApply: true
---

You have extensive expertise in Vue 3, TypeScript, Node.js, Vite, Vue Router, Pinia, VueUse, and CSS. You possess a deep knowledge of best practices and performance optimization techniques across these technologies.

Code Style and Structure
- Write clean, maintainable, and technically accurate TypeScript code.
- Emphasize iteration and modularization and minimize code duplication.
- Prefer the Composition API <script setup> style.
- Use composables to encapsulate and share reusable client-side logic or state across multiple components in your application.

Fetching Data
1. Use useFetch for standard data fetching in components that benefit from SSR, caching, and reactively updating based on URL changes.
2. Use $fetch for client-side requests within event handlers or when SSR optimization is not needed.
3. Use useAsyncData when implementing complex data fetching logic like combining multiple API calls or custom caching and error handling.
4. Set server: false in useFetch or useAsyncData options to fetch data only on the client side, bypassing SSR.
5. Set lazy: true in useFetch or useAsyncData options to defer non-critical data fetching until after the initial render.

Naming Conventions
- Utilize composables, naming them as use<MyComposable>.
- Use **PascalCase** for component file names (e.g., components/MyComponent.vue).
- Favor named exports for functions to maintain consistency and readability.

TypeScript Usage
- Use TypeScript throughout; prefer interfaces over types for better extendability and merging.
- Avoid enums, opting for maps for improved type safety and flexibility.
- Use functional components with TypeScript interfaces.

UI and Styling
- Implement responsive design; use a mobile-first approach.
@@ -0,0 +1,75 @@
"""Add chore history and scheduling tables

Revision ID: 05bf96a9e18b
Revises: 91d00c100f5b
Create Date: 2025-06-08 00:41:10.516324

"""
from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql

# revision identifiers, used by Alembic.
revision: str = '05bf96a9e18b'
down_revision: Union[str, None] = '91d00c100f5b'
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    """Upgrade schema."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_table('chore_history',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('chore_id', sa.Integer(), nullable=True),
        sa.Column('group_id', sa.Integer(), nullable=True),
        sa.Column('event_type', sa.Enum('CREATED', 'UPDATED', 'DELETED', 'COMPLETED', 'REOPENED', 'ASSIGNED', 'UNASSIGNED', 'REASSIGNED', 'SCHEDULE_GENERATED', 'DUE_DATE_CHANGED', 'DETAILS_CHANGED', name='chorehistoryeventtypeenum'), nullable=False),
        sa.Column('event_data', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
        sa.Column('changed_by_user_id', sa.Integer(), nullable=True),
        sa.Column('timestamp', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
        sa.ForeignKeyConstraint(['changed_by_user_id'], ['users.id'], ),
        sa.ForeignKeyConstraint(['chore_id'], ['chores.id'], ondelete='CASCADE'),
        sa.ForeignKeyConstraint(['group_id'], ['groups.id'], ondelete='CASCADE'),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_chore_history_chore_id'), 'chore_history', ['chore_id'], unique=False)
    op.create_index(op.f('ix_chore_history_group_id'), 'chore_history', ['group_id'], unique=False)
    op.create_index(op.f('ix_chore_history_id'), 'chore_history', ['id'], unique=False)
    op.create_table('chore_assignment_history',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('assignment_id', sa.Integer(), nullable=False),
        sa.Column('event_type', sa.Enum('CREATED', 'UPDATED', 'DELETED', 'COMPLETED', 'REOPENED', 'ASSIGNED', 'UNASSIGNED', 'REASSIGNED', 'SCHEDULE_GENERATED', 'DUE_DATE_CHANGED', 'DETAILS_CHANGED', name='chorehistoryeventtypeenum'), nullable=False),
        sa.Column('event_data', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
        sa.Column('changed_by_user_id', sa.Integer(), nullable=True),
        sa.Column('timestamp', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
        sa.ForeignKeyConstraint(['assignment_id'], ['chore_assignments.id'], ondelete='CASCADE'),
        sa.ForeignKeyConstraint(['changed_by_user_id'], ['users.id'], ),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_chore_assignment_history_assignment_id'), 'chore_assignment_history', ['assignment_id'], unique=False)
    op.create_index(op.f('ix_chore_assignment_history_id'), 'chore_assignment_history', ['id'], unique=False)
    op.drop_index('ix_apscheduler_jobs_next_run_time', table_name='apscheduler_jobs')
    op.drop_table('apscheduler_jobs')
    # ### end Alembic commands ###


def downgrade() -> None:
    """Downgrade schema."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_table('apscheduler_jobs',
        sa.Column('id', sa.VARCHAR(length=191), autoincrement=False, nullable=False),
        sa.Column('next_run_time', sa.DOUBLE_PRECISION(precision=53), autoincrement=False, nullable=True),
        sa.Column('job_state', postgresql.BYTEA(), autoincrement=False, nullable=False),
        sa.PrimaryKeyConstraint('id', name='apscheduler_jobs_pkey')
    )
    op.create_index('ix_apscheduler_jobs_next_run_time', 'apscheduler_jobs', ['next_run_time'], unique=False)
    op.drop_index(op.f('ix_chore_assignment_history_id'), table_name='chore_assignment_history')
    op.drop_index(op.f('ix_chore_assignment_history_assignment_id'), table_name='chore_assignment_history')
    op.drop_table('chore_assignment_history')
    op.drop_index(op.f('ix_chore_history_id'), table_name='chore_history')
    op.drop_index(op.f('ix_chore_history_group_id'), table_name='chore_history')
    op.drop_index(op.f('ix_chore_history_chore_id'), table_name='chore_history')
    op.drop_table('chore_history')
    # ### end Alembic commands ###
be/alembic/versions/bdf7427ccfa3_feature_updates_phase1.py (new file, 91 lines)
@@ -0,0 +1,91 @@
"""feature_updates_phase1

Revision ID: bdf7427ccfa3
Revises: 05bf96a9e18b
Create Date: 2025-06-09 18:00:11.083651

"""
from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql

# revision identifiers, used by Alembic.
revision: str = 'bdf7427ccfa3'
down_revision: Union[str, None] = '05bf96a9e18b'
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    """Upgrade schema."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_table('financial_audit_log',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('timestamp', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
        sa.Column('user_id', sa.Integer(), nullable=True),
        sa.Column('action_type', sa.String(), nullable=False),
        sa.Column('entity_type', sa.String(), nullable=False),
        sa.Column('entity_id', sa.Integer(), nullable=False),
        sa.Column('details', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
        sa.ForeignKeyConstraint(['user_id'], ['users.id'], ),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_financial_audit_log_action_type'), 'financial_audit_log', ['action_type'], unique=False)
    op.create_index(op.f('ix_financial_audit_log_id'), 'financial_audit_log', ['id'], unique=False)
    op.create_table('categories',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('name', sa.String(), nullable=False),
        sa.Column('user_id', sa.Integer(), nullable=True),
        sa.Column('group_id', sa.Integer(), nullable=True),
        sa.ForeignKeyConstraint(['group_id'], ['groups.id'], ),
        sa.ForeignKeyConstraint(['user_id'], ['users.id'], ),
        sa.PrimaryKeyConstraint('id'),
        sa.UniqueConstraint('name', 'user_id', 'group_id', name='uq_category_scope')
    )
    op.create_index(op.f('ix_categories_id'), 'categories', ['id'], unique=False)
    op.create_index(op.f('ix_categories_name'), 'categories', ['name'], unique=False)
    op.create_table('time_entries',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('chore_assignment_id', sa.Integer(), nullable=False),
        sa.Column('user_id', sa.Integer(), nullable=False),
        sa.Column('start_time', sa.DateTime(timezone=True), nullable=False),
        sa.Column('end_time', sa.DateTime(timezone=True), nullable=True),
        sa.Column('duration_seconds', sa.Integer(), nullable=True),
        sa.ForeignKeyConstraint(['chore_assignment_id'], ['chore_assignments.id'], ondelete='CASCADE'),
        sa.ForeignKeyConstraint(['user_id'], ['users.id'], ),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_time_entries_id'), 'time_entries', ['id'], unique=False)
    op.add_column('chores', sa.Column('parent_chore_id', sa.Integer(), nullable=True))
    op.create_index(op.f('ix_chores_parent_chore_id'), 'chores', ['parent_chore_id'], unique=False)
    op.create_foreign_key(None, 'chores', 'chores', ['parent_chore_id'], ['id'])
    op.add_column('items', sa.Column('category_id', sa.Integer(), nullable=True))
    op.create_foreign_key(None, 'items', 'categories', ['category_id'], ['id'])
    op.add_column('lists', sa.Column('archived_at', sa.DateTime(timezone=True), nullable=True))
    op.create_index(op.f('ix_lists_archived_at'), 'lists', ['archived_at'], unique=False)
    op.add_column('users', sa.Column('is_guest', sa.Boolean(), nullable=False, server_default='f'))
    # ### end Alembic commands ###


def downgrade() -> None:
    """Downgrade schema."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_column('users', 'is_guest')
    op.drop_index(op.f('ix_lists_archived_at'), table_name='lists')
    op.drop_column('lists', 'archived_at')
    op.drop_constraint(None, 'items', type_='foreignkey')
    op.drop_column('items', 'category_id')
    op.drop_constraint(None, 'chores', type_='foreignkey')
    op.drop_index(op.f('ix_chores_parent_chore_id'), table_name='chores')
    op.drop_column('chores', 'parent_chore_id')
    op.drop_index(op.f('ix_time_entries_id'), table_name='time_entries')
    op.drop_table('time_entries')
    op.drop_index(op.f('ix_categories_name'), table_name='categories')
    op.drop_index(op.f('ix_categories_id'), table_name='categories')
    op.drop_table('categories')
    op.drop_index(op.f('ix_financial_audit_log_id'), table_name='financial_audit_log')
    op.drop_index(op.f('ix_financial_audit_log_action_type'), table_name='financial_audit_log')
    op.drop_table('financial_audit_log')
    # ### end Alembic commands ###
@@ -0,0 +1,51 @@
"""add_updated_at_and_version_to_groups

Revision ID: c693ade3601c
Revises: bdf7427ccfa3
Create Date: 2025-06-09 19:22:36.244072

"""
from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql

# revision identifiers, used by Alembic.
revision: str = 'c693ade3601c'
down_revision: Union[str, None] = 'bdf7427ccfa3'
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    """Upgrade schema."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_index('ix_apscheduler_jobs_next_run_time', table_name='apscheduler_jobs')
    op.drop_table('apscheduler_jobs')
    op.add_column('groups', sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False))
    op.add_column('groups', sa.Column('version', sa.Integer(), server_default='1', nullable=False))
    op.alter_column('users', 'is_guest',
        existing_type=sa.BOOLEAN(),
        server_default=None,
        existing_nullable=False)
    # ### end Alembic commands ###


def downgrade() -> None:
    """Downgrade schema."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.alter_column('users', 'is_guest',
        existing_type=sa.BOOLEAN(),
        server_default=sa.text('false'),
        existing_nullable=False)
    op.drop_column('groups', 'version')
    op.drop_column('groups', 'updated_at')
    op.create_table('apscheduler_jobs',
        sa.Column('id', sa.VARCHAR(length=191), autoincrement=False, nullable=False),
        sa.Column('next_run_time', sa.DOUBLE_PRECISION(precision=53), autoincrement=False, nullable=True),
        sa.Column('job_state', postgresql.BYTEA(), autoincrement=False, nullable=False),
        sa.PrimaryKeyConstraint('id', name='apscheduler_jobs_pkey')
    )
    op.create_index('ix_apscheduler_jobs_next_run_time', 'apscheduler_jobs', ['next_run_time'], unique=False)
    # ### end Alembic commands ###
@@ -0,0 +1,101 @@
"""implement soft deletes and remove cascades

Revision ID: d5f8a2e4c7b9
Revises: c693ade3601c
Create Date: 2024-03-20 10:00:00.000000

"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql

# revision identifiers, used by Alembic.
revision = 'd5f8a2e4c7b9'
down_revision = 'c693ade3601c'
branch_labels = None
depends_on = None


def upgrade():
    # Add soft delete columns to relevant tables
    tables_for_soft_delete = [
        'users', 'groups', 'lists', 'items', 'expenses', 'expense_splits',
        'chores', 'chore_assignments', 'recurrence_patterns', 'categories',
        'time_entries'
    ]

    for table in tables_for_soft_delete:
        op.add_column(table, sa.Column('deleted_at', sa.DateTime(timezone=True), nullable=True))
        op.add_column(table, sa.Column('is_deleted', sa.Boolean(), server_default='false', nullable=False))
        op.create_index(f'ix_{table}_deleted_at', table, ['deleted_at'])
        op.create_index(f'ix_{table}_is_deleted', table, ['is_deleted'])

    # Remove cascade deletes from foreign keys
    # First drop existing foreign keys with cascade
    op.drop_constraint('user_groups_user_id_fkey', 'user_groups', type_='foreignkey')
    op.drop_constraint('user_groups_group_id_fkey', 'user_groups', type_='foreignkey')
    op.drop_constraint('invites_group_id_fkey', 'invites', type_='foreignkey')
    op.drop_constraint('items_list_id_fkey', 'items', type_='foreignkey')
    op.drop_constraint('expense_splits_expense_id_fkey', 'expense_splits', type_='foreignkey')
    op.drop_constraint('chores_group_id_fkey', 'chores', type_='foreignkey')
    op.drop_constraint('chore_assignments_chore_id_fkey', 'chore_assignments', type_='foreignkey')
    op.drop_constraint('chore_assignments_assigned_to_user_id_fkey', 'chore_assignments', type_='foreignkey')
    op.drop_constraint('chore_history_chore_id_fkey', 'chore_history', type_='foreignkey')
    op.drop_constraint('chore_history_group_id_fkey', 'chore_history', type_='foreignkey')
    op.drop_constraint('chore_assignment_history_assignment_id_fkey', 'chore_assignment_history', type_='foreignkey')
    op.drop_constraint('time_entries_chore_assignment_id_fkey', 'time_entries', type_='foreignkey')

    # Recreate foreign keys without cascade delete
    op.create_foreign_key('user_groups_user_id_fkey', 'user_groups', 'users', ['user_id'], ['id'])
    op.create_foreign_key('user_groups_group_id_fkey', 'user_groups', 'groups', ['group_id'], ['id'])
    op.create_foreign_key('invites_group_id_fkey', 'invites', 'groups', ['group_id'], ['id'])
    op.create_foreign_key('items_list_id_fkey', 'items', 'lists', ['list_id'], ['id'])
    op.create_foreign_key('expense_splits_expense_id_fkey', 'expense_splits', 'expenses', ['expense_id'], ['id'])
    op.create_foreign_key('chores_group_id_fkey', 'chores', 'groups', ['group_id'], ['id'])
    op.create_foreign_key('chore_assignments_chore_id_fkey', 'chore_assignments', 'chores', ['chore_id'], ['id'])
    op.create_foreign_key('chore_assignments_assigned_to_user_id_fkey', 'chore_assignments', 'users', ['assigned_to_user_id'], ['id'])
    op.create_foreign_key('chore_history_chore_id_fkey', 'chore_history', 'chores', ['chore_id'], ['id'])
    op.create_foreign_key('chore_history_group_id_fkey', 'chore_history', 'groups', ['group_id'], ['id'])
    op.create_foreign_key('chore_assignment_history_assignment_id_fkey', 'chore_assignment_history', 'chore_assignments', ['assignment_id'], ['id'])
    op.create_foreign_key('time_entries_chore_assignment_id_fkey', 'time_entries', 'chore_assignments', ['chore_assignment_id'], ['id'])


def downgrade():
    # Remove soft delete columns
    tables_for_soft_delete = [
        'users', 'groups', 'lists', 'items', 'expenses', 'expense_splits',
        'chores', 'chore_assignments', 'recurrence_patterns', 'categories',
        'time_entries'
    ]

    for table in tables_for_soft_delete:
        op.drop_index(f'ix_{table}_deleted_at', table)
        op.drop_index(f'ix_{table}_is_deleted', table)
        op.drop_column(table, 'deleted_at')
        op.drop_column(table, 'is_deleted')

    # Restore cascade deletes
    op.drop_constraint('user_groups_user_id_fkey', 'user_groups', type_='foreignkey')
    op.drop_constraint('user_groups_group_id_fkey', 'user_groups', type_='foreignkey')
    op.drop_constraint('invites_group_id_fkey', 'invites', type_='foreignkey')
    op.drop_constraint('items_list_id_fkey', 'items', type_='foreignkey')
    op.drop_constraint('expense_splits_expense_id_fkey', 'expense_splits', type_='foreignkey')
    op.drop_constraint('chores_group_id_fkey', 'chores', type_='foreignkey')
    op.drop_constraint('chore_assignments_chore_id_fkey', 'chore_assignments', type_='foreignkey')
    op.drop_constraint('chore_assignments_assigned_to_user_id_fkey', 'chore_assignments', type_='foreignkey')
    op.drop_constraint('chore_history_chore_id_fkey', 'chore_history', type_='foreignkey')
    op.drop_constraint('chore_history_group_id_fkey', 'chore_history', type_='foreignkey')
    op.drop_constraint('chore_assignment_history_assignment_id_fkey', 'chore_assignment_history', type_='foreignkey')
    op.drop_constraint('time_entries_chore_assignment_id_fkey', 'time_entries', type_='foreignkey')

    # Recreate foreign keys with cascade delete
    op.create_foreign_key('user_groups_user_id_fkey', 'user_groups', 'users', ['user_id'], ['id'], ondelete='CASCADE')
    op.create_foreign_key('user_groups_group_id_fkey', 'user_groups', 'groups', ['group_id'], ['id'], ondelete='CASCADE')
    op.create_foreign_key('invites_group_id_fkey', 'invites', 'groups', ['group_id'], ['id'], ondelete='CASCADE')
    op.create_foreign_key('items_list_id_fkey', 'items', 'lists', ['list_id'], ['id'], ondelete='CASCADE')
    op.create_foreign_key('expense_splits_expense_id_fkey', 'expense_splits', 'expenses', ['expense_id'], ['id'], ondelete='CASCADE')
    op.create_foreign_key('chores_group_id_fkey', 'chores', 'groups', ['group_id'], ['id'], ondelete='CASCADE')
    op.create_foreign_key('chore_assignments_chore_id_fkey', 'chore_assignments', 'chores', ['chore_id'], ['id'], ondelete='CASCADE')
    op.create_foreign_key('chore_assignments_assigned_to_user_id_fkey', 'chore_assignments', 'users', ['assigned_to_user_id'], ['id'], ondelete='CASCADE')
    op.create_foreign_key('chore_history_chore_id_fkey', 'chore_history', 'chores', ['chore_id'], ['id'], ondelete='CASCADE')
    op.create_foreign_key('chore_history_group_id_fkey', 'chore_history', 'groups', ['group_id'], ['id'], ondelete='CASCADE')
    op.create_foreign_key('chore_assignment_history_assignment_id_fkey', 'chore_assignment_history', 'chore_assignments', ['assignment_id'], ['id'], ondelete='CASCADE')
    op.create_foreign_key('time_entries_chore_assignment_id_fkey', 'time_entries', 'chore_assignments', ['chore_assignment_id'], ['id'], ondelete='CASCADE')
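Once cascades are gone and rows carry `deleted_at` / `is_deleted`, every read path in the application has to filter deleted rows out itself. A minimal sketch of that pattern, using plain dicts in place of ORM rows (the helper names `soft_delete` and `active_only` are hypothetical, not from this codebase):

```python
from datetime import datetime, timezone

def soft_delete(row: dict) -> dict:
    """Mark a row as deleted without removing it, mirroring the
    deleted_at / is_deleted columns added by the migration."""
    row["is_deleted"] = True
    row["deleted_at"] = datetime.now(timezone.utc)
    return row

def active_only(rows: list[dict]) -> list[dict]:
    """The filter every query must now apply instead of relying
    on the database cascading deletes."""
    return [r for r in rows if not r.get("is_deleted", False)]

rows = [{"id": 1, "is_deleted": False}, {"id": 2, "is_deleted": False}]
soft_delete(rows[1])
print([r["id"] for r in active_only(rows)])
```

The partial indexes created above (`ix_{table}_is_deleted`, `ix_{table}_deleted_at`) exist to keep that filter cheap.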
@@ -1,12 +1,5 @@
# app/api/api_router.py
from fastapi import APIRouter

from app.api.v1.api import api_router_v1  # Import the v1 router
from app.api.v1.api import api_router_v1

api_router = APIRouter()

# Include versioned routers here, adding the /api prefix
api_router.include_router(api_router_v1, prefix="/v1")  # Mounts v1 endpoints under /api/v1/...

# Add other API versions later
# e.g., api_router.include_router(api_router_v2, prefix="/v2")
api_router.include_router(api_router_v1, prefix="/v1")
65	be/app/api/auth/guest.py	Normal file
@@ -0,0 +1,65 @@
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.ext.asyncio import AsyncSession
import uuid

from app import models
from app.schemas.user import UserCreate, UserClaim, UserPublic
from app.schemas.auth import Token
from app.database import get_session
from app.auth import current_active_user, get_jwt_strategy, get_refresh_jwt_strategy
from app.core.security import get_password_hash
from app.crud import user as crud_user

router = APIRouter()

@router.post("/guest", response_model=Token)
async def create_guest_user(db: AsyncSession = Depends(get_session)):
    """
    Creates a new guest user.
    """
    guest_email = f"guest_{uuid.uuid4()}@guest.mitlist.app"
    guest_password = uuid.uuid4().hex

    user_in = UserCreate(email=guest_email, password=guest_password)
    user = await crud_user.create_user(db, user_in=user_in, is_guest=True)

    # Use the same JWT strategy as regular login to generate both access and refresh tokens
    access_strategy = get_jwt_strategy()
    refresh_strategy = get_refresh_jwt_strategy()

    access_token = await access_strategy.write_token(user)
    refresh_token = await refresh_strategy.write_token(user)

    return {
        "access_token": access_token,
        "refresh_token": refresh_token,
        "token_type": "bearer"
    }

@router.post("/guest/claim", response_model=UserPublic)
async def claim_guest_account(
    claim_in: UserClaim,
    db: AsyncSession = Depends(get_session),
    current_user: models.User = Depends(current_active_user),
):
    """
    Claims a guest account, converting it to a full user.
    """
    if not current_user.is_guest:
        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail="Not a guest account.")

    existing_user = await crud_user.get_user_by_email(db, email=claim_in.email)
    if existing_user:
        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail="Email already registered.")

    hashed_password = get_password_hash(claim_in.password)
    current_user.email = claim_in.email
    current_user.hashed_password = hashed_password
    current_user.is_guest = False
    current_user.is_verified = False  # Require email verification

    db.add(current_user)
    await db.commit()
    await db.refresh(current_user)

    return current_user
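The guest identity above is a synthetic, collision-resistant credential pair: a UUID-based address under the guest domain and a random 32-character hex password. The shape can be sketched in isolation (the helper `make_guest_credentials` is hypothetical, extracted from the endpoint for illustration):

```python
import uuid

def make_guest_credentials() -> tuple[str, str]:
    """Mirror the endpoint's throwaway identity: a unique address under
    the guest domain plus a random hex password the user never sees."""
    email = f"guest_{uuid.uuid4()}@guest.mitlist.app"
    password = uuid.uuid4().hex  # 32 hex chars
    return email, password

email, password = make_guest_credentials()
print(email, password)
```

Because the password is random and never shown, the only way to keep such an account is the `/guest/claim` flow, which swaps in a real email and a user-chosen password.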
26	be/app/api/auth/jwt.py	Normal file
@@ -0,0 +1,26 @@
from fastapi import APIRouter
from app.auth import auth_backend, fastapi_users
from app.schemas.user import UserCreate, UserPublic, UserUpdate

router = APIRouter()

router.include_router(
    fastapi_users.get_auth_router(auth_backend),
    prefix="/jwt",
    tags=["auth"],
)
router.include_router(
    fastapi_users.get_register_router(UserPublic, UserCreate),
    prefix="",
    tags=["auth"],
)
router.include_router(
    fastapi_users.get_reset_password_router(),
    prefix="",
    tags=["auth"],
)
router.include_router(
    fastapi_users.get_verify_router(UserPublic),
    prefix="",
    tags=["auth"],
)
@@ -1,11 +1,12 @@
from fastapi import APIRouter, Depends, Request
from fastapi.responses import RedirectResponse
from fastapi import APIRouter, Depends, Request, HTTPException, status
from fastapi.responses import RedirectResponse, JSONResponse
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
from app.database import get_transactional_session
from app.models import User
from app.auth import oauth, fastapi_users, auth_backend, get_jwt_strategy, get_refresh_jwt_strategy
from app.auth import oauth, fastapi_users, auth_backend, get_jwt_strategy, get_refresh_jwt_strategy, get_user_manager
from app.config import settings
from fastapi.security import OAuth2PasswordRequestForm

router = APIRouter()

@@ -18,30 +19,26 @@ async def google_callback(request: Request, db: AsyncSession = Depends(get_trans
    token_data = await oauth.google.authorize_access_token(request)
    user_info = await oauth.google.parse_id_token(request, token_data)

    # Check if user exists
    existing_user = (await db.execute(select(User).where(User.email == user_info['email']))).scalar_one_or_none()

    user_to_login = existing_user
    if not existing_user:
        # Create new user
        new_user = User(
            email=user_info['email'],
            name=user_info.get('name', user_info.get('email')),
            is_verified=True,  # Email is verified by Google
            is_verified=True,
            is_active=True
        )
        db.add(new_user)
        await db.flush()  # Use flush instead of commit since we're in a transaction
        await db.flush()
        user_to_login = new_user

    # Generate JWT tokens using the new backend
    access_strategy = get_jwt_strategy()
    refresh_strategy = get_refresh_jwt_strategy()

    access_token = await access_strategy.write_token(user_to_login)
    refresh_token = await refresh_strategy.write_token(user_to_login)

    # Redirect to frontend with tokens
    redirect_url = f"{settings.FRONTEND_URL}/auth/callback?access_token={access_token}&refresh_token={refresh_token}"

    return RedirectResponse(url=redirect_url)
@@ -61,12 +58,10 @@ async def apple_callback(request: Request, db: AsyncSession = Depends(get_transa
    if 'email' not in user_info:
        return RedirectResponse(url=f"{settings.FRONTEND_URL}/auth/callback?error=apple_email_missing")

    # Check if user exists
    existing_user = (await db.execute(select(User).where(User.email == user_info['email']))).scalar_one_or_none()

    user_to_login = existing_user
    if not existing_user:
        # Create new user
        name_info = user_info.get('name', {})
        first_name = name_info.get('firstName', '')
        last_name = name_info.get('lastName', '')
@@ -75,21 +70,62 @@ async def apple_callback(request: Request, db: AsyncSession = Depends(get_transa
        new_user = User(
            email=user_info['email'],
            name=full_name,
            is_verified=True,  # Email is verified by Apple
            is_verified=True,
            is_active=True
        )
        db.add(new_user)
        await db.flush()  # Use flush instead of commit since we're in a transaction
        await db.flush()
        user_to_login = new_user

    # Generate JWT tokens using the new backend
    access_strategy = get_jwt_strategy()
    refresh_strategy = get_refresh_jwt_strategy()

    access_token = await access_strategy.write_token(user_to_login)
    refresh_token = await refresh_strategy.write_token(user_to_login)

    # Redirect to frontend with tokens
    redirect_url = f"{settings.FRONTEND_URL}/auth/callback?access_token={access_token}&refresh_token={refresh_token}"

    return RedirectResponse(url=redirect_url)


@router.post('/jwt/refresh')
async def refresh_jwt_token(
    request: Request,
    user_manager=Depends(get_user_manager),
):
    """Refresh the JWT access token using a valid refresh token.

    The incoming request must provide a JSON body with a ``refresh_token`` field.
    A new access and refresh token pair will be returned when the provided refresh
    token is valid. If the token is invalid or expired, a *401* error is raised.
    """

    data = await request.json()
    refresh_token = data.get('refresh_token')

    if not refresh_token:
        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail="Missing refresh token")

    refresh_strategy = get_refresh_jwt_strategy()

    try:
        # ``read_token`` needs a callback capable of loading the *User* from the
        # database. We therefore pass the user manager obtained via dependency
        # injection so that the strategy can hydrate the full *User* instance.
        user = await refresh_strategy.read_token(refresh_token, user_manager)
    except Exception:
        # Any error during decoding or lookup should result in an unauthorized
        # response to avoid leaking information about token validity.
        raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail="Invalid refresh token")

    if not user:
        raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail="Invalid refresh token")

    access_strategy = get_jwt_strategy()
    access_token = await access_strategy.write_token(user)
    new_refresh_token = await refresh_strategy.write_token(user)

    return JSONResponse({
        "access_token": access_token,
        "refresh_token": new_refresh_token,
        "token_type": "bearer"
    })
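The refresh endpoint above is stateless: it issues a fresh token pair but cannot invalidate the old refresh token, which stays usable until its JWT expiry. A stateful variant would rotate tokens, making each refresh token single-use. A minimal sketch of that alternative design (the `RefreshRotator` class and its in-memory store are illustrative assumptions, not part of this codebase):

```python
import secrets

class RefreshRotator:
    """Single-use refresh tokens: each successful refresh returns a new
    access/refresh pair and retires the token that was presented."""

    def __init__(self) -> None:
        self._valid: set[str] = set()  # server-side store of live refresh tokens

    def issue(self) -> tuple[str, str]:
        refresh = secrets.token_hex(16)
        self._valid.add(refresh)
        return secrets.token_hex(16), refresh  # (access, refresh)

    def refresh(self, token: str) -> tuple[str, str]:
        if token not in self._valid:
            # Mirrors the endpoint's behaviour: any invalid token -> 401
            raise PermissionError("Invalid refresh token")
        self._valid.discard(token)  # rotation: the old token is now dead
        return self.issue()

rot = RefreshRotator()
access, refresh = rot.issue()
access2, refresh2 = rot.refresh(refresh)
```

The trade-off is the usual one: the stateless JWT approach above needs no storage but cannot revoke, while rotation needs a shared store yet detects token replay immediately.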
@@ -9,6 +9,10 @@ from app.api.v1.endpoints import ocr
from app.api.v1.endpoints import costs
from app.api.v1.endpoints import financials
from app.api.v1.endpoints import chores
from app.api.v1.endpoints import history
from app.api.v1.endpoints import categories
from app.api.v1.endpoints import users
from app.api.auth import oauth, guest, jwt

api_router_v1 = APIRouter()

@@ -21,5 +25,9 @@ api_router_v1.include_router(ocr.router, prefix="/ocr", tags=["OCR"])
api_router_v1.include_router(costs.router, prefix="/costs", tags=["Costs"])
api_router_v1.include_router(financials.router, prefix="/financials", tags=["Financials"])
api_router_v1.include_router(chores.router, prefix="/chores", tags=["Chores"])
# Add other v1 endpoint routers here later
# e.g., api_router_v1.include_router(users.router, prefix="/users", tags=["Users"])
api_router_v1.include_router(history.router, prefix="/history", tags=["History"])
api_router_v1.include_router(categories.router, prefix="/categories", tags=["Categories"])
api_router_v1.include_router(oauth.router, prefix="/auth", tags=["Auth"])
api_router_v1.include_router(guest.router, prefix="/auth", tags=["Auth"])
api_router_v1.include_router(jwt.router, prefix="/auth", tags=["Auth"])
api_router_v1.include_router(users.router, prefix="/users", tags=["Users"])
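Router prefixes compose by concatenation, so the final URL of a route is the sum of every prefix it is mounted under. Assuming `api_router` is itself mounted at `/api` (as the comment in `api_router.py` suggests), the wiring above can be sketched as plain string composition:

```python
def mounted_path(*prefixes: str, route: str = "") -> str:
    """FastAPI concatenates include_router prefixes outer-to-inner."""
    return "".join(prefixes) + route

# guest.py's POST /guest, mounted under /auth, under /v1, under /api:
print(mounted_path("/api", "/v1", "/auth", route="/guest"))
# jwt.py's login route adds its own /jwt prefix on top:
print(mounted_path("/api", "/v1", "/auth", "/jwt", route="/login"))
```

This is why `oauth`, `guest`, and `jwt` can all share the `/auth` prefix: their individual route paths never collide.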
75	be/app/api/v1/endpoints/categories.py	Normal file
@@ -0,0 +1,75 @@
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.ext.asyncio import AsyncSession
from typing import List, Optional

from app import models
from app.schemas.category import CategoryCreate, CategoryUpdate, CategoryPublic
from app.database import get_session
from app.auth import current_active_user
from app.crud import category as crud_category, group as crud_group

router = APIRouter()

@router.post("/", response_model=CategoryPublic)
async def create_category(
    category_in: CategoryCreate,
    group_id: Optional[int] = None,
    db: AsyncSession = Depends(get_session),
    current_user: models.User = Depends(current_active_user),
):
    if group_id:
        is_member = await crud_group.is_user_member(db, user_id=current_user.id, group_id=group_id)
        if not is_member:
            raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Not a member of this group")

    return await crud_category.create_category(db=db, category_in=category_in, user_id=current_user.id, group_id=group_id)

@router.get("/", response_model=List[CategoryPublic])
async def read_categories(
    group_id: Optional[int] = None,
    db: AsyncSession = Depends(get_session),
    current_user: models.User = Depends(current_active_user),
):
    if group_id:
        is_member = await crud_group.is_user_member(db, user_id=current_user.id, group_id=group_id)
        if not is_member:
            raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Not a member of this group")
        return await crud_category.get_group_categories(db=db, group_id=group_id)
    return await crud_category.get_user_categories(db=db, user_id=current_user.id)

@router.put("/{category_id}", response_model=CategoryPublic)
async def update_category(
    category_id: int,
    category_in: CategoryUpdate,
    db: AsyncSession = Depends(get_session),
    current_user: models.User = Depends(current_active_user),
):
    db_category = await crud_category.get_category(db, category_id=category_id)
    if not db_category:
        raise HTTPException(status_code=404, detail="Category not found")
    if db_category.user_id != current_user.id:
        if not db_category.group_id:
            raise HTTPException(status_code=403, detail="Not your category")
        is_member = await crud_group.is_user_member(db, user_id=current_user.id, group_id=db_category.group_id)
        if not is_member:
            raise HTTPException(status_code=403, detail="Not a member of this group")

    return await crud_category.update_category(db=db, db_category=db_category, category_in=category_in)

@router.delete("/{category_id}", response_model=CategoryPublic)
async def delete_category(
    category_id: int,
    db: AsyncSession = Depends(get_session),
    current_user: models.User = Depends(current_active_user),
):
    db_category = await crud_category.get_category(db, category_id=category_id)
    if not db_category:
        raise HTTPException(status_code=404, detail="Category not found")
    if db_category.user_id != current_user.id:
        if not db_category.group_id:
            raise HTTPException(status_code=403, detail="Not your category")
        is_member = await crud_group.is_user_member(db, user_id=current_user.id, group_id=db_category.group_id)
        if not is_member:
            raise HTTPException(status_code=403, detail="Not a member of this group")

    return await crud_category.delete_category(db=db, db_category=db_category)
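The update and delete handlers above share one access rule: the owner may always modify a category; a non-owner may only modify it when the category belongs to a group that the caller is a member of. That rule can be isolated as a pure function (the name `can_modify_category` is hypothetical, extracted here only to make the branching testable):

```python
from typing import Optional, Set

def can_modify_category(
    user_id: int,
    owner_id: int,
    group_id: Optional[int],
    group_member_ids: Set[int],
) -> bool:
    """Mirror the endpoints' checks: owner always passes; otherwise the
    category must be a group category and the user a group member."""
    if user_id == owner_id:
        return True
    if group_id is None:
        return False  # personal category owned by someone else -> 403
    return user_id in group_member_ids

print(can_modify_category(1, 1, None, set()))      # owner of a personal category
print(can_modify_category(2, 1, 7, {2, 3}))        # fellow group member
print(can_modify_category(2, 1, None, {2, 3}))     # someone else's personal category
```

Factoring the rule out like this would also remove the duplication between `update_category` and `delete_category`.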
@@ -1,21 +1,38 @@
# app/api/v1/endpoints/chores.py
import logging
from typing import List as PyList, Optional
from datetime import datetime, timezone

from fastapi import APIRouter, Depends, HTTPException, status, Response
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select

from app.database import get_transactional_session, get_session
from app.auth import current_active_user
from app.models import User as UserModel, Chore as ChoreModel, ChoreTypeEnum
from app.schemas.chore import ChoreCreate, ChoreUpdate, ChorePublic, ChoreAssignmentCreate, ChoreAssignmentUpdate, ChoreAssignmentPublic
from app.models import User as UserModel, Chore as ChoreModel, ChoreTypeEnum, TimeEntry
from app.schemas.chore import (
    ChoreCreate, ChoreUpdate, ChorePublic,
    ChoreAssignmentCreate, ChoreAssignmentUpdate, ChoreAssignmentPublic,
    ChoreHistoryPublic, ChoreAssignmentHistoryPublic
)
from app.schemas.time_entry import TimeEntryPublic
from app.crud import chore as crud_chore
from app.core.exceptions import ChoreNotFoundError, PermissionDeniedError, GroupNotFoundError, DatabaseIntegrityError
from app.crud import history as crud_history
from app.crud import group as crud_group
from app.core.exceptions import ChoreNotFoundError, PermissionDeniedError, GroupNotFoundError, DatabaseIntegrityError, GroupMembershipError, GroupPermissionError

logger = logging.getLogger(__name__)
router = APIRouter()

# Add this new endpoint before the personal chores section
# --- Remove legacy duplicate chore endpoints (personal/* and groups/*/chores/*) ---
_UNSUPPORTED_CHORE_PATHS = {
    "/personal",
    "/personal/{chore_id}",
    "/groups/{group_id}/chores",
    "/groups/{group_id}/chores/{chore_id}",
}
router.routes = [r for r in router.routes if getattr(r, "path", None) not in _UNSUPPORTED_CHORE_PATHS]

@router.get(
    "/all",
    response_model=PyList[ChorePublic],
@@ -23,13 +40,12 @@ router = APIRouter()
    tags=["Chores"]
)
async def list_all_chores(
    db: AsyncSession = Depends(get_session),  # Use read-only session for GET
    db: AsyncSession = Depends(get_session),
    current_user: UserModel = Depends(current_active_user),
):
    """Retrieves all chores (personal and group) for the current user in a single optimized request."""
    logger.info(f"User {current_user.email} listing all their chores")

    # Use the optimized function that reduces database queries
    all_chores = await crud_chore.get_all_user_chores(db=db, user_id=current_user.id)

    return all_chores
@@ -116,6 +132,66 @@ async def update_personal_chore(
        logger.error(f"DatabaseIntegrityError updating personal chore {chore_id} for {current_user.email}: {e.detail}", exc_info=True)
        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail=e.detail)

@router.put(
    "/{chore_id}",
    response_model=ChorePublic,
    summary="Update Chore (Any Type)",
    tags=["Chores"]
)
async def update_chore_any_type(
    chore_id: int,
    chore_in: ChoreUpdate,
    db: AsyncSession = Depends(get_transactional_session),
    current_user: UserModel = Depends(current_active_user),
):
    """Updates a chore of any type, including conversions between personal and group chores."""
    logger.info(f"User {current_user.email} updating chore ID: {chore_id}")

    # Get the current chore to determine its type and group
    current_chore = await crud_chore.get_chore_by_id(db, chore_id)
    if not current_chore:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=f"Chore {chore_id} not found")

    # Check permissions on the current chore
    if current_chore.type == ChoreTypeEnum.personal:
        if current_chore.created_by_id != current_user.id:
            raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="You can only update your own personal chores")
    else:  # group chore
        if not await crud_group.is_user_member(db, current_chore.group_id, current_user.id):
            raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail=f"You are not a member of group {current_chore.group_id}")

    # If converting to group chore, validate the target group
    if chore_in.type == ChoreTypeEnum.group and chore_in.group_id:
        if not await crud_group.is_user_member(db, chore_in.group_id, current_user.id):
            raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail=f"You are not a member of target group {chore_in.group_id}")

    try:
        # Use the current group_id for the update call if not changing groups
        group_id_for_update = current_chore.group_id if current_chore.type == ChoreTypeEnum.group else None

        updated_chore = await crud_chore.update_chore(
            db=db,
            chore_id=chore_id,
            chore_in=chore_in,
            user_id=current_user.id,
            group_id=group_id_for_update
        )
        if not updated_chore:
            raise ChoreNotFoundError(chore_id=chore_id)
        return updated_chore
    except ChoreNotFoundError as e:
        logger.warning(f"Chore {e.chore_id} not found for user {current_user.email} during update.")
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=e.detail)
    except PermissionDeniedError as e:
        logger.warning(f"Permission denied for user {current_user.email} updating chore {chore_id}: {e.detail}")
        raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail=e.detail)
    except ValueError as e:
        logger.warning(f"ValueError updating chore {chore_id} for user {current_user.email}: {str(e)}")
        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e))
    except DatabaseIntegrityError as e:
        logger.error(f"DatabaseIntegrityError updating chore {chore_id} for {current_user.email}: {e.detail}", exc_info=True)
        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail=e.detail)

@router.delete(
    "/personal/{chore_id}",
    status_code=status.HTTP_204_NO_CONTENT,
@@ -130,14 +206,12 @@ async def delete_personal_chore(
    """Deletes a personal chore for the current user."""
    logger.info(f"User {current_user.email} deleting personal chore ID: {chore_id}")
    try:
        # First, verify it's a personal chore belonging to the user
        chore_to_delete = await crud_chore.get_chore_by_id(db, chore_id)
        if not chore_to_delete or chore_to_delete.type != ChoreTypeEnum.personal or chore_to_delete.created_by_id != current_user.id:
            raise ChoreNotFoundError(chore_id=chore_id, detail="Personal chore not found or not owned by user.")

        success = await crud_chore.delete_chore(db=db, chore_id=chore_id, user_id=current_user.id, group_id=None)
        if not success:
            # This case should be rare if the above check passes and DB is consistent
            raise ChoreNotFoundError(chore_id=chore_id)
        return Response(status_code=status.HTTP_204_NO_CONTENT)
    except ChoreNotFoundError as e:
@@ -151,7 +225,6 @@ async def delete_personal_chore(
        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail=e.detail)

# --- Group Chores Endpoints ---
# (These would be similar to what you might have had before, but now explicitly part of this router)

@router.post(
    "/groups/{group_id}/chores",
@@ -177,7 +250,7 @@ async def create_group_chore(
    chore_payload = chore_in.model_copy(update={"group_id": group_id, "type": ChoreTypeEnum.group})

    try:
        return await crud_chore.create_chore(db=db, chore_in=chore_payload, user_id=current_user.id, group_id=group_id)
        return await crud_chore.create_chore(db=db, chore_in=chore_payload, user_id=current_user.id)
    except GroupNotFoundError as e:
        logger.warning(f"Group {e.group_id} not found for chore creation by user {current_user.email}.")
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=e.detail)
@@ -223,18 +296,16 @@ async def update_group_chore(
    db: AsyncSession = Depends(get_transactional_session),
    current_user: UserModel = Depends(current_active_user),
):
    """Updates a chore's details within a specific group."""
    """Updates a chore's details. The group_id in path is the current group of the chore."""
    logger.info(f"User {current_user.email} updating chore ID {chore_id} in group {group_id}")
    if chore_in.type is not None and chore_in.type != ChoreTypeEnum.group:
        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail="Cannot change chore type to personal via this endpoint.")
    if chore_in.group_id is not None and chore_in.group_id != group_id:
        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=f"Chore's group_id if provided must match path group_id ({group_id}).")

    # Ensure chore_in has the correct type for the CRUD operation
    chore_payload = chore_in.model_copy(update={"type": ChoreTypeEnum.group, "group_id": group_id} if chore_in.type is None else {"group_id": group_id})
    # Validate that the chore is in the specified group
    chore_to_update = await crud_chore.get_chore_by_id_and_group(db, chore_id, group_id, current_user.id)
    if not chore_to_update:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=f"Chore {chore_id} not found in group {group_id}")

    try:
        updated_chore = await crud_chore.update_chore(db=db, chore_id=chore_id, chore_in=chore_payload, user_id=current_user.id, group_id=group_id)
        updated_chore = await crud_chore.update_chore(db=db, chore_id=chore_id, chore_in=chore_in, user_id=current_user.id, group_id=group_id)
        if not updated_chore:
            raise ChoreNotFoundError(chore_id=chore_id, group_id=group_id)
        return updated_chore
@@ -266,15 +337,12 @@ async def delete_group_chore(
    """Deletes a chore from a group, ensuring user has permission."""
    logger.info(f"User {current_user.email} deleting chore ID {chore_id} from group {group_id}")
    try:
        # Verify chore exists and belongs to the group before attempting deletion via CRUD
        # This gives a more precise error if the chore exists but isn't in this group.
        chore_to_delete = await crud_chore.get_chore_by_id_and_group(db, chore_id, group_id, current_user.id)  # checks permission too
        if not chore_to_delete:  # get_chore_by_id_and_group will raise PermissionDeniedError if user not member
            raise ChoreNotFoundError(chore_id=chore_id, group_id=group_id)

        success = await crud_chore.delete_chore(db=db, chore_id=chore_id, user_id=current_user.id, group_id=group_id)
        if not success:
            # This case should be rare if the above check passes and DB is consistent
            raise ChoreNotFoundError(chore_id=chore_id, group_id=group_id)
        return Response(status_code=status.HTTP_204_NO_CONTENT)
    except ChoreNotFoundError as e:
@@ -326,7 +394,7 @@ async def create_chore_assignment(
)
async def list_my_assignments(
    include_completed: bool = False,
    db: AsyncSession = Depends(get_session),  # Use read-only session for GET
    db: AsyncSession = Depends(get_session),
    current_user: UserModel = Depends(current_active_user),
|
||||
):
|
||||
"""Retrieves all chore assignments for the current user."""
|
||||
@ -338,14 +406,14 @@ async def list_my_assignments(
|
||||
raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="Failed to retrieve assignments")
|
||||
|
||||
@router.get(
|
||||
"/chores/{chore_id}/assignments",
|
||||
"/{chore_id}/assignments",
|
||||
response_model=PyList[ChoreAssignmentPublic],
|
||||
summary="List Chore Assignments",
|
||||
tags=["Chore Assignments"]
|
||||
)
|
||||
async def list_chore_assignments(
|
||||
chore_id: int,
|
||||
db: AsyncSession = Depends(get_session), # Use read-only session for GET
|
||||
db: AsyncSession = Depends(get_session),
|
||||
current_user: UserModel = Depends(current_active_user),
|
||||
):
|
||||
"""Retrieves all assignments for a specific chore."""
|
||||
@ -451,3 +519,268 @@ async def complete_chore_assignment(
|
||||
except DatabaseIntegrityError as e:
|
||||
logger.error(f"DatabaseIntegrityError completing assignment {assignment_id} for {current_user.email}: {e.detail}", exc_info=True)
|
||||
raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail=e.detail)
|
||||
|
||||
# === CHORE HISTORY ENDPOINTS ===

@router.get(
    "/{chore_id}/history",
    response_model=PyList[ChoreHistoryPublic],
    summary="Get Chore History",
    tags=["Chores", "History"]
)
async def get_chore_history(
    chore_id: int,
    db: AsyncSession = Depends(get_session),
    current_user: UserModel = Depends(current_active_user),
):
    """Retrieves the history of a specific chore."""
    chore = await crud_chore.get_chore_by_id(db, chore_id)
    if not chore:
        raise ChoreNotFoundError(chore_id=chore_id)

    if chore.type == ChoreTypeEnum.personal and chore.created_by_id != current_user.id:
        raise PermissionDeniedError("You can only view history for your own personal chores.")

    if chore.type == ChoreTypeEnum.group:
        is_member = await crud_chore.is_user_member(db, chore.group_id, current_user.id)
        if not is_member:
            raise PermissionDeniedError("You must be a member of the group to view this chore's history.")

    logger.info(f"User {current_user.email} getting history for chore {chore_id}")
    return await crud_history.get_chore_history(db=db, chore_id=chore_id)

@router.get(
    "/assignments/{assignment_id}/history",
    response_model=PyList[ChoreAssignmentHistoryPublic],
    summary="Get Chore Assignment History",
    tags=["Chore Assignments", "History"]
)
async def get_chore_assignment_history(
    assignment_id: int,
    db: AsyncSession = Depends(get_session),
    current_user: UserModel = Depends(current_active_user),
):
    """Retrieves the history of a specific chore assignment."""
    assignment = await crud_chore.get_chore_assignment_by_id(db, assignment_id)
    if not assignment:
        raise ChoreNotFoundError(assignment_id=assignment_id)

    chore = await crud_chore.get_chore_by_id(db, assignment.chore_id)
    if not chore:
        raise ChoreNotFoundError(chore_id=assignment.chore_id)

    if chore.type == ChoreTypeEnum.personal and chore.created_by_id != current_user.id:
        raise PermissionDeniedError("You can only view history for assignments of your own personal chores.")

    if chore.type == ChoreTypeEnum.group:
        is_member = await crud_chore.is_user_member(db, chore.group_id, current_user.id)
        if not is_member:
            raise PermissionDeniedError("You must be a member of the group to view this assignment's history.")

    logger.info(f"User {current_user.email} getting history for assignment {assignment_id}")
    return await crud_history.get_assignment_history(db=db, assignment_id=assignment_id)

# === TIME ENTRY ENDPOINTS ===

@router.get(
    "/assignments/{assignment_id}/time-entries",
    response_model=PyList[TimeEntryPublic],
    summary="Get Time Entries",
    tags=["Time Tracking"]
)
async def get_time_entries_for_assignment(
    assignment_id: int,
    db: AsyncSession = Depends(get_session),
    current_user: UserModel = Depends(current_active_user),
):
    """Retrieves all time entries for a specific chore assignment."""
    assignment = await crud_chore.get_chore_assignment_by_id(db, assignment_id)
    if not assignment:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Assignment not found")

    chore = await crud_chore.get_chore_by_id(db, assignment.chore_id)
    if not chore:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Chore not found")

    # Permission check
    if chore.type == ChoreTypeEnum.personal and chore.created_by_id != current_user.id:
        raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Permission denied")

    if chore.type == ChoreTypeEnum.group:
        is_member = await crud_chore.is_user_member(db, chore.group_id, current_user.id)
        if not is_member:
            raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Permission denied")

    # For now, return time entries for the current user only
    time_entries = await db.execute(
        select(TimeEntry)
        .where(TimeEntry.chore_assignment_id == assignment_id)
        .where(TimeEntry.user_id == current_user.id)
        .order_by(TimeEntry.start_time.desc())
    )
    return time_entries.scalars().all()

@router.post(
    "/assignments/{assignment_id}/time-entries",
    response_model=TimeEntryPublic,
    status_code=status.HTTP_201_CREATED,
    summary="Start Time Entry",
    tags=["Time Tracking"]
)
async def start_time_entry(
    assignment_id: int,
    db: AsyncSession = Depends(get_transactional_session),
    current_user: UserModel = Depends(current_active_user),
):
    """Starts a new time entry for a chore assignment."""
    assignment = await crud_chore.get_chore_assignment_by_id(db, assignment_id)
    if not assignment:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Assignment not found")

    chore = await crud_chore.get_chore_by_id(db, assignment.chore_id)
    if not chore:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Chore not found")

    # Permission check - only assigned user can track time
    if assignment.assigned_to_user_id != current_user.id:
        raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Only assigned user can track time")

    # Check if there's already an active time entry
    existing_active = await db.execute(
        select(TimeEntry)
        .where(TimeEntry.chore_assignment_id == assignment_id)
        .where(TimeEntry.user_id == current_user.id)
        .where(TimeEntry.end_time.is_(None))
    )
    if existing_active.scalar_one_or_none():
        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail="Time entry already active")

    # Create new time entry
    time_entry = TimeEntry(
        chore_assignment_id=assignment_id,
        user_id=current_user.id,
        start_time=datetime.now(timezone.utc)
    )
    db.add(time_entry)
    await db.commit()
    await db.refresh(time_entry)

    return time_entry

@router.put(
    "/time-entries/{time_entry_id}",
    response_model=TimeEntryPublic,
    summary="Stop Time Entry",
    tags=["Time Tracking"]
)
async def stop_time_entry(
    time_entry_id: int,
    db: AsyncSession = Depends(get_transactional_session),
    current_user: UserModel = Depends(current_active_user),
):
    """Stops an active time entry."""
    time_entry = await db.get(TimeEntry, time_entry_id)
    if not time_entry:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Time entry not found")

    if time_entry.user_id != current_user.id:
        raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Permission denied")

    if time_entry.end_time:
        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail="Time entry already stopped")

    # Stop the time entry
    end_time = datetime.now(timezone.utc)
    time_entry.end_time = end_time
    time_entry.duration_seconds = int((end_time - time_entry.start_time).total_seconds())

    await db.commit()
    await db.refresh(time_entry)

    return time_entry
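The start/stop pair above enforces one open entry per user per assignment and derives `duration_seconds` when the timer stops. The same invariant can be sketched in isolation, using a plain dataclass as a stand-in for the `TimeEntry` model (class and method names here are illustrative, not the app's actual API):

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class TimerEntry:
    user_id: int
    start_time: datetime
    end_time: Optional[datetime] = None
    duration_seconds: Optional[int] = None


class AssignmentTimer:
    """Mirrors the endpoint rules: one active entry per user; stop computes duration."""

    def __init__(self) -> None:
        self.entries: list = []

    def start(self, user_id: int) -> TimerEntry:
        # Reject a second concurrent entry for the same user (400 in the endpoint)
        if any(e.user_id == user_id and e.end_time is None for e in self.entries):
            raise ValueError("Time entry already active")
        entry = TimerEntry(user_id=user_id, start_time=datetime.now(timezone.utc))
        self.entries.append(entry)
        return entry

    def stop(self, entry: TimerEntry) -> TimerEntry:
        # Stopping twice is an error (400 in the endpoint)
        if entry.end_time is not None:
            raise ValueError("Time entry already stopped")
        entry.end_time = datetime.now(timezone.utc)
        entry.duration_seconds = int((entry.end_time - entry.start_time).total_seconds())
        return entry
```

Once an entry is stopped, a new one may be started, so a user's history is a sequence of closed intervals plus at most one open interval.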

@router.post(
    "",
    response_model=ChorePublic,
    status_code=status.HTTP_201_CREATED,
    summary="Create Chore (Any Type)",
    tags=["Chores"],
)
async def create_chore_any_type(
    chore_in: ChoreCreate,
    db: AsyncSession = Depends(get_transactional_session),
    current_user: UserModel = Depends(current_active_user),
):
    """Create either a personal or group chore using a single endpoint."""
    logger.info(f"User {current_user.email} creating chore (type={chore_in.type}) name={chore_in.name}")

    # Basic permission & validation
    if chore_in.type == ChoreTypeEnum.personal:
        if chore_in.group_id is not None:
            raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail="group_id must be null for personal chores")
    elif chore_in.type == ChoreTypeEnum.group:
        if chore_in.group_id is None:
            raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail="group_id is required for group chores")
        # ensure membership
        try:
            await crud_group.check_group_membership(db, group_id=chore_in.group_id, user_id=current_user.id, action="create chores for")
        except (GroupMembershipError, GroupNotFoundError) as e:
            raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail=str(e))
    else:
        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail="Invalid chore type")

    try:
        created = await crud_chore.create_chore(db=db, chore_in=chore_in, user_id=current_user.id)
        return created
    except Exception as e:
        logger.error(f"Error creating chore: {e}", exc_info=True)
        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e))

@router.delete(
    "/{chore_id}",
    status_code=status.HTTP_204_NO_CONTENT,
    summary="Delete Chore (Any Type)",
    tags=["Chores"],
)
async def delete_chore_any_type(
    chore_id: int,
    db: AsyncSession = Depends(get_transactional_session),
    current_user: UserModel = Depends(current_active_user),
):
    """Delete a personal or group chore based on permissions."""
    chore = await crud_chore.get_chore_by_id(db, chore_id)
    if not chore:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Chore not found")

    if chore.type == ChoreTypeEnum.personal:
        if chore.created_by_id != current_user.id:
            raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="You can only delete your own personal chores")
        allowed = True
        target_group_id = None
    else:
        target_group_id = chore.group_id
        try:
            await crud_group.check_user_role_in_group(db, group_id=target_group_id, user_id=current_user.id, required_role=UserRoleEnum.owner, action="delete chore in group")
            allowed = True
        except GroupPermissionError:
            # fallback: creator may delete their own group chore
            allowed = (chore.created_by_id == current_user.id)
        except (GroupMembershipError, GroupNotFoundError):
            allowed = False

    if not allowed:
        raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Not authorized to delete this chore")

    try:
        success = await crud_chore.delete_chore(db=db, chore_id=chore_id, user_id=current_user.id, group_id=target_group_id)
        if not success:
            raise ChoreNotFoundError(chore_id=chore_id)
    except ChoreNotFoundError:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Chore not found")
    except PermissionDeniedError as e:
        raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail=e.detail)
    except Exception as e:
        logger.error(f"Error deleting chore {chore_id}: {e}", exc_info=True)
        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR)

    return Response(status_code=status.HTTP_204_NO_CONTENT)

@@ -1,113 +1,24 @@
# app/api/v1/endpoints/costs.py
import logging
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import Session, selectinload
from decimal import Decimal, ROUND_HALF_UP, ROUND_DOWN
from typing import List

from app.database import get_transactional_session
from app.auth import current_active_user
from app.models import (
    User as UserModel,
    Group as GroupModel,
    List as ListModel,
    Expense as ExpenseModel,
    Item as ItemModel,
    UserGroup as UserGroupModel,
    SplitTypeEnum,
    ExpenseSplit as ExpenseSplitModel,
    Settlement as SettlementModel,
    SettlementActivity as SettlementActivityModel  # Added
)
from app.models import User as UserModel, Expense as ExpenseModel
from app.schemas.cost import ListCostSummary, GroupBalanceSummary
from app.schemas.expense import ExpensePublic
from app.services import costs_service
from app.core.exceptions import (
    ListNotFoundError,
    ListPermissionError,
    GroupNotFoundError,
    GroupPermissionError,
    InvalidOperationError
)
from app.schemas.cost import ListCostSummary, GroupBalanceSummary, UserCostShare, UserBalanceDetail, SuggestedSettlement
from app.schemas.expense import ExpenseCreate
from app.crud import list as crud_list
from app.crud import expense as crud_expense
from app.core.exceptions import ListNotFoundError, ListPermissionError, UserNotFoundError, GroupNotFoundError

logger = logging.getLogger(__name__)
router = APIRouter()

def calculate_suggested_settlements(user_balances: List[UserBalanceDetail]) -> List[SuggestedSettlement]:
    """
    Calculate suggested settlements to balance the finances within a group.

    This function takes the current balances of all users and suggests optimal settlements
    to minimize the number of transactions needed to settle all debts.

    Args:
        user_balances: List of UserBalanceDetail objects with their current balances

    Returns:
        List of SuggestedSettlement objects representing the suggested payments
    """
    # Create list of users who owe money (negative balance) and who are owed money (positive balance)
    debtors = []    # Users who owe money (negative balance)
    creditors = []  # Users who are owed money (positive balance)

    # Threshold to consider a balance as zero due to floating point precision
    epsilon = Decimal('0.01')

    # Sort users into debtors and creditors
    for user in user_balances:
        # Skip users with zero balance (or very close to zero)
        if abs(user.net_balance) < epsilon:
            continue

        if user.net_balance < Decimal('0'):
            # User owes money
            debtors.append({
                'user_id': user.user_id,
                'user_identifier': user.user_identifier,
                'amount': -user.net_balance  # Convert to positive amount
            })
        else:
            # User is owed money
            creditors.append({
                'user_id': user.user_id,
                'user_identifier': user.user_identifier,
                'amount': user.net_balance
            })

    # Sort by amount (descending) to handle largest debts first
    debtors.sort(key=lambda x: x['amount'], reverse=True)
    creditors.sort(key=lambda x: x['amount'], reverse=True)

    settlements = []

    # Iterate through debtors and match them with creditors
    while debtors and creditors:
        debtor = debtors[0]
        creditor = creditors[0]

        # Determine the settlement amount (the smaller of the two amounts)
        amount = min(debtor['amount'], creditor['amount']).quantize(Decimal('0.01'), rounding=ROUND_HALF_UP)

        # Create settlement record
        if amount > Decimal('0'):
            settlements.append(
                SuggestedSettlement(
                    from_user_id=debtor['user_id'],
                    from_user_identifier=debtor['user_identifier'],
                    to_user_id=creditor['user_id'],
                    to_user_identifier=creditor['user_identifier'],
                    amount=amount
                )
            )

        # Update balances
        debtor['amount'] -= amount
        creditor['amount'] -= amount

        # Remove users who have settled their debts/credits
        if debtor['amount'] < epsilon:
            debtors.pop(0)
        if creditor['amount'] < epsilon:
            creditors.pop(0)

    return settlements
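The matching loop above is a greedy pass over sorted balances: pay the largest debt into the largest credit, shrink both, and drop whichever side reaches zero. A self-contained sketch of the same idea, with plain dicts and tuples in place of the `UserBalanceDetail` and `SuggestedSettlement` schemas (the function name and signature are assumptions for illustration):

```python
from decimal import Decimal, ROUND_HALF_UP


def suggest_settlements(balances: dict) -> list:
    """Greedy matcher: returns (from_user, to_user, amount) transfers settling all debts.

    `balances` maps user_id -> net balance (negative = owes money, positive = is owed).
    """
    epsilon = Decimal("0.01")
    # Debt amounts are flipped to positive so both lists sort the same way
    debtors = [[uid, -b] for uid, b in balances.items() if b < -epsilon]
    creditors = [[uid, b] for uid, b in balances.items() if b > epsilon]
    debtors.sort(key=lambda x: x[1], reverse=True)
    creditors.sort(key=lambda x: x[1], reverse=True)

    transfers = []
    while debtors and creditors:
        # Transfer the smaller of the two open amounts, rounded to cents
        amount = min(debtors[0][1], creditors[0][1]).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
        if amount > 0:
            transfers.append((debtors[0][0], creditors[0][0], amount))
        debtors[0][1] -= amount
        creditors[0][1] -= amount
        # Drop whichever side is (effectively) settled
        if debtors[0][1] < epsilon:
            debtors.pop(0)
        if creditors[0][1] < epsilon:
            creditors.pop(0)
    return transfers
```

Because each iteration fully settles at least one participant, the result has at most `len(debtors) + len(creditors) - 1` transfers.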

@router.get(
    "/lists/{list_id}/cost-summary",
@@ -116,8 +27,8 @@ def calculate_suggested_settlements(user_balances: List[UserBalanceDetail]) -> L
    tags=["Costs"],
    responses={
        status.HTTP_403_FORBIDDEN: {"description": "User does not have permission to access this list"},
        status.HTTP_404_NOT_FOUND: {"description": "List or associated user not found"}
    }
        status.HTTP_404_NOT_FOUND: {"description": "List not found"},
    },
)
async def get_list_cost_summary(
    list_id: int,
@@ -125,151 +36,62 @@ async def get_list_cost_summary(
    current_user: UserModel = Depends(current_active_user),
):
    """
    Retrieves a calculated cost summary for a specific list, detailing total costs,
    equal shares per user, and individual user balances based on their contributions.

    The user must have access to the list to view its cost summary.
    Costs are split among group members if the list belongs to a group, or just for
    the creator if it's a personal list. All users who added items with prices are
    included in the calculation.
    Retrieves a calculated cost summary for a specific list.
    If an expense has been generated for this list, the summary will be based on that.
    Otherwise, it will be a basic summary of item prices.
    This endpoint is idempotent and does not create any data.
    """
    logger.info(f"User {current_user.email} requesting cost summary for list {list_id}")

    # 1. Verify user has access to the target list
    try:
        await crud_list.check_list_permission(db=db, list_id=list_id, user_id=current_user.id)
        return await costs_service.get_list_cost_summary_logic(
            db=db, list_id=list_id, current_user_id=current_user.id
        )
    except ListPermissionError as e:
        logger.warning(f"Permission denied for user {current_user.email} on list {list_id}: {str(e)}")
        raise
        raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail=str(e))
    except ListNotFoundError as e:
        logger.warning(f"List {list_id} not found when checking permissions for cost summary: {str(e)}")
        raise
        logger.warning(f"List {list_id} not found when getting cost summary: {str(e)}")
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=str(e))

    # 2. Get the list with its items and users
    list_result = await db.execute(
        select(ListModel)
        .options(
            selectinload(ListModel.items).options(selectinload(ItemModel.added_by_user)),
            selectinload(ListModel.group).options(selectinload(GroupModel.member_associations).options(selectinload(UserGroupModel.user))),
            selectinload(ListModel.creator)

@router.post(
    "/lists/{list_id}/cost-summary",
    response_model=ExpensePublic,
    status_code=status.HTTP_201_CREATED,
    summary="Generate and Get Expense from List Summary",
    tags=["Costs"],
    responses={
        status.HTTP_403_FORBIDDEN: {"description": "User does not have permission to access this list"},
        status.HTTP_404_NOT_FOUND: {"description": "List not found"},
        status.HTTP_400_BAD_REQUEST: {"description": "Invalid operation (e.g., no items to expense, or expense already exists)"},
    },
)
async def generate_expense_from_list_summary(
    list_id: int,
    db: AsyncSession = Depends(get_transactional_session),
    current_user: UserModel = Depends(current_active_user),
):
    """
    Creates an ITEM_BASED expense from the items in a given list.
    This should be called to finalize the costs for a shopping list and turn it into a formal expense.
    It will fail if an expense for this list already exists.
    """
    logger.info(f"User {current_user.email} requesting to generate expense from list {list_id}")
    try:
        expense = await costs_service.generate_expense_from_list_logic(
            db=db, list_id=list_id, current_user_id=current_user.id
        )
        .where(ListModel.id == list_id)
    )
    db_list = list_result.scalars().first()
    if not db_list:
        raise ListNotFoundError(list_id)
        return expense
    except (ListPermissionError, GroupPermissionError) as e:
        logger.warning(f"Permission denied for user {current_user.email} on list {list_id}: {str(e)}")
        raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail=str(e))
    except (ListNotFoundError, GroupNotFoundError) as e:
        logger.warning(f"Resource not found for list {list_id}: {str(e)}")
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=str(e))
    except InvalidOperationError as e:
        logger.warning(f"Invalid operation for list {list_id}: {str(e)}")
        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e))

    # 3. Get or create an expense for this list
    expense_result = await db.execute(
        select(ExpenseModel)
        .where(ExpenseModel.list_id == list_id)
        .options(selectinload(ExpenseModel.splits))
    )
    db_expense = expense_result.scalars().first()

    if not db_expense:
        # Create a new expense for this list
        total_amount = sum(item.price for item in db_list.items if item.price is not None and item.price > Decimal("0"))
        if total_amount == Decimal("0"):
            return ListCostSummary(
                list_id=db_list.id,
                list_name=db_list.name,
                total_list_cost=Decimal("0.00"),
                num_participating_users=0,
                equal_share_per_user=Decimal("0.00"),
                user_balances=[]
            )

        # Create expense with ITEM_BASED split type
        expense_in = ExpenseCreate(
            description=f"Cost summary for list {db_list.name}",
            total_amount=total_amount,
            list_id=list_id,
            split_type=SplitTypeEnum.ITEM_BASED,
            paid_by_user_id=db_list.creator.id
        )
        db_expense = await crud_expense.create_expense(db=db, expense_in=expense_in, current_user_id=current_user.id)

    # 4. Calculate cost summary from expense splits
    participating_users = set()
    user_items_added_value = {}
    total_list_cost = Decimal("0.00")

    # Get all users who added items
    for item in db_list.items:
        if item.price is not None and item.price > Decimal("0") and item.added_by_user:
            participating_users.add(item.added_by_user)
            user_items_added_value[item.added_by_user.id] = user_items_added_value.get(item.added_by_user.id, Decimal("0.00")) + item.price
            total_list_cost += item.price

    # Get all users from expense splits
    for split in db_expense.splits:
        if split.user:
            participating_users.add(split.user)

    num_participating_users = len(participating_users)
    if num_participating_users == 0:
        return ListCostSummary(
            list_id=db_list.id,
            list_name=db_list.name,
            total_list_cost=Decimal("0.00"),
            num_participating_users=0,
            equal_share_per_user=Decimal("0.00"),
            user_balances=[]
        )

    # This is the ideal equal share, returned in the summary
    equal_share_per_user_for_response = (total_list_cost / Decimal(num_participating_users)).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

    # Sort users for deterministic remainder distribution
    sorted_participating_users = sorted(list(participating_users), key=lambda u: u.id)

    user_final_shares = {}
    if num_participating_users > 0:
        base_share_unrounded = total_list_cost / Decimal(num_participating_users)

        # Calculate initial share for each user, rounding down
        for user in sorted_participating_users:
            user_final_shares[user.id] = base_share_unrounded.quantize(Decimal("0.01"), rounding=ROUND_DOWN)

        # Calculate sum of rounded down shares
        sum_of_rounded_shares = sum(user_final_shares.values())

        # Calculate remaining pennies to be distributed
        remaining_pennies = int(((total_list_cost - sum_of_rounded_shares) * Decimal("100")).to_integral_value(rounding=ROUND_HALF_UP))

        # Distribute remaining pennies one by one to sorted users
        for i in range(remaining_pennies):
            user_to_adjust = sorted_participating_users[i % num_participating_users]
            user_final_shares[user_to_adjust.id] += Decimal("0.01")

    user_balances = []
    for user in sorted_participating_users:  # Iterate over sorted users
        items_added = user_items_added_value.get(user.id, Decimal("0.00"))
        # current_user_share is now the precisely calculated share for this user
        current_user_share = user_final_shares.get(user.id, Decimal("0.00"))

        balance = items_added - current_user_share
        user_identifier = user.name if user.name else user.email
        user_balances.append(
            UserCostShare(
                user_id=user.id,
                user_identifier=user_identifier,
                items_added_value=items_added.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP),
                amount_due=current_user_share.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP),
                balance=balance.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
            )
        )

    user_balances.sort(key=lambda x: x.user_identifier)
    return ListCostSummary(
        list_id=db_list.id,
        list_name=db_list.name,
        total_list_cost=total_list_cost.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP),
        num_participating_users=num_participating_users,
        equal_share_per_user=equal_share_per_user_for_response,  # Use the ideal share for the response field
        user_balances=user_balances
    )
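The rounding scheme above (round every share down to the cent, then hand out the leftover pennies one at a time in a deterministic user order) is the largest-remainder method; it guarantees the shares sum exactly to the total. An isolated sketch of that step, with a hypothetical helper name:

```python
from decimal import Decimal, ROUND_DOWN


def split_exact(total: Decimal, user_ids: list) -> dict:
    """Largest-remainder split: shares sum exactly to `total`; spare pennies go to the lowest ids."""
    n = len(user_ids)
    ordered = sorted(user_ids)  # deterministic remainder distribution
    # Round every share down to the cent
    base = (total / n).quantize(Decimal("0.01"), rounding=ROUND_DOWN)
    shares = {uid: base for uid in ordered}
    # The shortfall is always fewer than n pennies
    remaining_pennies = int((total - base * n) * 100)
    for i in range(remaining_pennies):
        shares[ordered[i % n]] += Decimal("0.01")
    return shares
```

Compare with a naive `(total / n).quantize(...)` for every user, which can drift from the total by a few cents whenever the division is inexact.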
|
||||
|
||||
@router.get(
|
||||
"/groups/{group_id}/balance-summary",
|
||||
@ -278,8 +100,8 @@ async def get_list_cost_summary(
|
||||
tags=["Costs", "Groups"],
|
||||
responses={
|
||||
status.HTTP_403_FORBIDDEN: {"description": "User does not have permission to access this group"},
|
||||
status.HTTP_404_NOT_FOUND: {"description": "Group not found"}
|
||||
}
|
||||
status.HTTP_404_NOT_FOUND: {"description": "Group not found"},
|
||||
},
|
||||
)
|
||||
async def get_group_balance_summary(
|
||||
group_id: int,
|
||||
@ -292,132 +114,13 @@ async def get_group_balance_summary(
|
||||
The user must be a member of the group to view its balance summary.
|
||||
"""
|
||||
logger.info(f"User {current_user.email} requesting balance summary for group {group_id}")
-    # 1. Verify user is a member of the target group
-    group_check = await db.execute(
-        select(GroupModel)
-        .options(selectinload(GroupModel.member_associations))
-        .where(GroupModel.id == group_id)
-    )
-    db_group_for_check = group_check.scalars().first()
-
-    if not db_group_for_check:
-        raise GroupNotFoundError(group_id)
-
-    user_is_member = any(assoc.user_id == current_user.id for assoc in db_group_for_check.member_associations)
-    if not user_is_member:
-        raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail=f"User not a member of group {group_id}")
-
-    # 2. Get all expenses and settlements for the group
-    expenses_result = await db.execute(
-        select(ExpenseModel)
-        .where(ExpenseModel.group_id == group_id)
-        .options(selectinload(ExpenseModel.splits).selectinload(ExpenseSplitModel.user))
-    )
-    expenses = expenses_result.scalars().all()
-
-    settlements_result = await db.execute(
-        select(SettlementModel)
-        .where(SettlementModel.group_id == group_id)
-        .options(
-            selectinload(SettlementModel.paid_by_user),
-            selectinload(SettlementModel.paid_to_user)
-        )
-    )
-    settlements = settlements_result.scalars().all()
-
-    # Fetch SettlementActivities related to the group's expenses
-    # This requires joining SettlementActivity -> ExpenseSplit -> Expense
-    settlement_activities_result = await db.execute(
-        select(SettlementActivityModel)
-        .join(ExpenseSplitModel, SettlementActivityModel.expense_split_id == ExpenseSplitModel.id)
-        .join(ExpenseModel, ExpenseSplitModel.expense_id == ExpenseModel.id)
-        .where(ExpenseModel.group_id == group_id)
-        .options(selectinload(SettlementActivityModel.payer))  # Optional: if you need payer details directly
-    )
-    settlement_activities = settlement_activities_result.scalars().all()
-
-    # 3. Calculate user balances
-    user_balances_data = {}
-    # Initialize UserBalanceDetail for each group member
-    for assoc in db_group_for_check.member_associations:
-        if assoc.user:
-            user_balances_data[assoc.user.id] = {
-                "user_id": assoc.user.id,
-                "user_identifier": assoc.user.name if assoc.user.name else assoc.user.email,
-                "total_paid_for_expenses": Decimal("0.00"),
-                "initial_total_share_of_expenses": Decimal("0.00"),
-                "total_amount_paid_via_settlement_activities": Decimal("0.00"),
-                "total_generic_settlements_paid": Decimal("0.00"),
-                "total_generic_settlements_received": Decimal("0.00"),
-            }
-
-    # Process Expenses
-    for expense in expenses:
-        if expense.paid_by_user_id in user_balances_data:
-            user_balances_data[expense.paid_by_user_id]["total_paid_for_expenses"] += expense.total_amount
-        for split in expense.splits:
-            if split.user_id in user_balances_data:
-                user_balances_data[split.user_id]["initial_total_share_of_expenses"] += split.owed_amount
-
-    # Process Settlement Activities (SettlementActivityModel)
-    for activity in settlement_activities:
-        if activity.paid_by_user_id in user_balances_data:
-            user_balances_data[activity.paid_by_user_id]["total_amount_paid_via_settlement_activities"] += activity.amount_paid
-
-    # Process Generic Settlements (SettlementModel)
-    for settlement in settlements:
-        if settlement.paid_by_user_id in user_balances_data:
-            user_balances_data[settlement.paid_by_user_id]["total_generic_settlements_paid"] += settlement.amount
-        if settlement.paid_to_user_id in user_balances_data:
-            user_balances_data[settlement.paid_to_user_id]["total_generic_settlements_received"] += settlement.amount
-
-    # Calculate Final Balances
-    final_user_balances = []
-    for user_id, data in user_balances_data.items():
-        initial_total_share_of_expenses = data["initial_total_share_of_expenses"]
-        total_amount_paid_via_settlement_activities = data["total_amount_paid_via_settlement_activities"]
-
-        adjusted_total_share_of_expenses = initial_total_share_of_expenses - total_amount_paid_via_settlement_activities
-
-        total_paid_for_expenses = data["total_paid_for_expenses"]
-        total_generic_settlements_received = data["total_generic_settlements_received"]
-        total_generic_settlements_paid = data["total_generic_settlements_paid"]
-
-        net_balance = (
-            total_paid_for_expenses + total_generic_settlements_received
-        ) - (adjusted_total_share_of_expenses + total_generic_settlements_paid)
-
-        # Quantize all final values for UserBalanceDetail schema
-        user_detail = UserBalanceDetail(
-            user_id=data["user_id"],
-            user_identifier=data["user_identifier"],
-            total_paid_for_expenses=total_paid_for_expenses.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP),
-            # Store adjusted_total_share_of_expenses in total_share_of_expenses
-            total_share_of_expenses=adjusted_total_share_of_expenses.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP),
-            # Store total_generic_settlements_paid in total_settlements_paid
-            total_settlements_paid=total_generic_settlements_paid.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP),
-            total_settlements_received=total_generic_settlements_received.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP),
-            net_balance=net_balance.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
-        )
-        final_user_balances.append(user_detail)
-
-    # Sort by user identifier
-    final_user_balances.sort(key=lambda x: x.user_identifier)
-
-    # Calculate suggested settlements
-    suggested_settlements = calculate_suggested_settlements(final_user_balances)
-
-    # Calculate overall totals for the group
-    overall_total_expenses = sum(expense.total_amount for expense in expenses)
-    overall_total_settlements = sum(settlement.amount for settlement in settlements)
-
-    return GroupBalanceSummary(
-        group_id=db_group_for_check.id,
-        group_name=db_group_for_check.name,
-        overall_total_expenses=overall_total_expenses.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP),
-        overall_total_settlements=overall_total_settlements.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP),
-        user_balances=final_user_balances,
-        suggested_settlements=suggested_settlements
-    )
+    try:
+        return await costs_service.get_group_balance_summary_logic(
+            db=db, group_id=group_id, current_user_id=current_user.id
+        )
+    except GroupPermissionError as e:
+        logger.warning(f"Permission denied for user {current_user.email} on group {group_id}: {str(e)}")
+        raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail=str(e))
+    except GroupNotFoundError as e:
+        logger.warning(f"Group {group_id} not found when getting balance summary: {str(e)}")
+        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=str(e))
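The removed inline logic above ends by delegating transfer suggestions to `calculate_suggested_settlements`, which is not shown in this diff. A minimal greedy debt-simplification sketch of what such a helper typically does (hypothetical names; Decimal amounts as in the endpoint):

```python
from decimal import Decimal

def suggest_settlements(net_balances: dict[int, Decimal]) -> list[tuple[int, int, Decimal]]:
    """Greedy debt simplification: repeatedly match the largest debtor
    with the largest creditor until every balance reaches zero.
    Returns (from_user_id, to_user_id, amount) transfers."""
    debtors = sorted(
        ((uid, -bal) for uid, bal in net_balances.items() if bal < 0),
        key=lambda x: x[1], reverse=True,
    )
    creditors = sorted(
        ((uid, bal) for uid, bal in net_balances.items() if bal > 0),
        key=lambda x: x[1], reverse=True,
    )
    transfers = []
    i = j = 0
    while i < len(debtors) and j < len(creditors):
        d_uid, d_amt = debtors[i]
        c_uid, c_amt = creditors[j]
        paid = min(d_amt, c_amt)
        transfers.append((d_uid, c_uid, paid))
        debtors[i] = (d_uid, d_amt - paid)
        creditors[j] = (c_uid, c_amt - paid)
        if debtors[i][1] == 0:
            i += 1
        if creditors[j][1] == 0:
            j += 1
    return transfers
```

Matching the largest debtor to the largest creditor settles n members in at most n - 1 transfers, assuming the net balances sum to zero as they do in the summary above.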
@@ -4,7 +4,7 @@ from fastapi import APIRouter, Depends, HTTPException, status, Query, Response
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
from sqlalchemy.orm import joinedload
-from typing import List as PyList, Optional, Sequence
+from typing import List as PyList, Optional, Sequence, Union

from app.database import get_transactional_session
from app.auth import current_active_user
@@ -14,13 +14,16 @@ from app.models import (
    List as ListModel,
    UserGroup as UserGroupModel,
    UserRoleEnum,
-    ExpenseSplit as ExpenseSplitModel
+    ExpenseSplit as ExpenseSplitModel,
+    Expense as ExpenseModel,
+    Settlement as SettlementModel
)
from app.schemas.expense import (
    ExpenseCreate, ExpensePublic,
    SettlementCreate, SettlementPublic,
    ExpenseUpdate, SettlementUpdate
)
+from app.schemas.financials import FinancialActivityResponse
from app.schemas.settlement_activity import SettlementActivityCreate, SettlementActivityPublic  # Added
from app.crud import expense as crud_expense
from app.crud import settlement as crud_settlement
@@ -30,8 +33,9 @@ from app.crud import list as crud_list
from app.core.exceptions import (
    ListNotFoundError, GroupNotFoundError, UserNotFoundError,
    InvalidOperationError, GroupPermissionError, ListPermissionError,
-    ItemNotFoundError, GroupMembershipError
+    ItemNotFoundError, GroupMembershipError, OverpaymentError, FinancialConflictError
)
+from app.services import financials_service

logger = logging.getLogger(__name__)
router = APIRouter()
@@ -75,9 +79,17 @@ async def create_new_expense(
        effective_group_id = list_obj.group_id
        is_group_context = True  # Expense is tied to a group via the list
    elif expense_in.group_id:
-        raise InvalidOperationError(f"Personal list {list_obj.id} cannot have expense associated with group {expense_in.group_id}.")
-    # If list is personal, no group check needed yet, handled by payer check below.
-
+        # Allow linking a personal list to a group expense (see TODO issue #6).
+        # We validate that the current user is a member of the specified group so
+        # they cannot attach their personal list to an arbitrary group.
+        effective_group_id = expense_in.group_id
+        is_group_context = True
+        await crud_group.check_group_membership(
+            db,
+            group_id=effective_group_id,
+            user_id=current_user.id,
+            action="create expense from personal list for group"
+        )
    elif effective_group_id:  # Only group_id provided for expense
        is_group_context = True
        # Ensure user is at least a member to create expense in group context
@@ -167,8 +179,12 @@ async def list_expenses(
    expenses = await crud_expense.get_user_accessible_expenses(db, user_id=current_user.id, skip=skip, limit=limit)

    # Apply recurring filter if specified
+    # NOTE: the original code referenced a non-existent ``expense.recurrence_rule`` attribute.
+    # The canonical way to know if an expense is recurring is the ``is_recurring`` flag
+    # (and/or the presence of a ``recurrence_pattern``). We use ``is_recurring`` here
+    # because it is explicit, indexed and does not require an extra JOIN.
    if isRecurring is not None:
-        expenses = [expense for expense in expenses if bool(expense.recurrence_rule) == isRecurring]
+        expenses = [expense for expense in expenses if expense.is_recurring == isRecurring]

    return expenses
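The corrected filter above compares the explicit `is_recurring` flag instead of a non-existent `recurrence_rule` attribute. A standalone sketch of the same in-memory filter semantics, with plain objects standing in for ExpenseModel rows:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Expense:
    id: int
    is_recurring: bool

def filter_recurring(expenses: list[Expense], is_recurring: Optional[bool]) -> list[Expense]:
    # None means "no filter": return everything, matching the endpoint's
    # behaviour when the isRecurring query parameter is omitted.
    if is_recurring is None:
        return list(expenses)
    return [e for e in expenses if e.is_recurring == is_recurring]
```

Filtering in Python after fetching works, but pushing the predicate into the query (a `.where()` clause on the flag column) would avoid loading rows that are immediately discarded.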
@@ -413,6 +429,10 @@ async def record_settlement_for_expense_split(
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=f"User referenced in settlement activity not found: {str(e)}")
    except InvalidOperationError as e:
        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e))
+    except OverpaymentError as e:
+        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e))
+    except FinancialConflictError as e:
+        raise HTTPException(status_code=status.HTTP_409_CONFLICT, detail=str(e))
    except Exception as e:
        logger.error(f"Unexpected error recording settlement activity for expense_split_id {expense_split_id}: {str(e)}", exc_info=True)
        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="An unexpected error occurred while recording settlement activity.")
@@ -567,23 +587,21 @@ async def update_settlement_details(

    # --- Granular Permission Check ---
    can_modify = False
-    # 1. User is involved party (payer or payee)
-    is_party = current_user.id in [settlement_db.paid_by_user_id, settlement_db.paid_to_user_id]
-    if is_party:
+    # 1. Original creator may modify their own record
+    if settlement_db.created_by_user_id == current_user.id:
        can_modify = True
-    # 2. OR User is owner of the group the settlement belongs to
-    # Note: Settlements always have a group_id based on current model
+    # 2. Otherwise only a group owner may modify
    elif settlement_db.group_id:
        try:
-            await crud_group.check_user_role_in_group(db, group_id=settlement_db.group_id, user_id=current_user.id, required_role=UserRoleEnum.owner, action="modify group settlements")
+            await crud_group.check_user_role_in_group(
+                db,
+                group_id=settlement_db.group_id,
+                user_id=current_user.id,
+                required_role=UserRoleEnum.owner,
+                action="modify group settlements created by others"
+            )
            can_modify = True
            logger.info(f"Allowing update for settlement {settlement_id} by group owner {current_user.email}")
-        except GroupMembershipError:
-            pass
-        except GroupPermissionError:
-            pass
-        except GroupNotFoundError:
-            logger.error(f"Group {settlement_db.group_id} not found for settlement {settlement_id} during update check.")
+        except (GroupMembershipError, GroupPermissionError, GroupNotFoundError):
+            pass

    if not can_modify:
@@ -622,22 +640,19 @@ async def delete_settlement_record(

    # --- Granular Permission Check ---
    can_delete = False
-    # 1. User is involved party (payer or payee)
-    is_party = current_user.id in [settlement_db.paid_by_user_id, settlement_db.paid_to_user_id]
-    if is_party:
-        can_delete = True
-    # 2. OR User is owner of the group the settlement belongs to
-    elif settlement_db.group_id:
+    # Only a group owner can delete a settlement (regardless of who created it)
+    if settlement_db.group_id:
        try:
-            await crud_group.check_user_role_in_group(db, group_id=settlement_db.group_id, user_id=current_user.id, required_role=UserRoleEnum.owner, action="delete group settlements")
+            await crud_group.check_user_role_in_group(
+                db,
+                group_id=settlement_db.group_id,
+                user_id=current_user.id,
+                required_role=UserRoleEnum.owner,
+                action="delete group settlements"
+            )
            can_delete = True
            logger.info(f"Allowing delete for settlement {settlement_id} by group owner {current_user.email}")
-        except GroupMembershipError:
-            pass
-        except GroupPermissionError:
-            pass
-        except GroupNotFoundError:
-            logger.error(f"Group {settlement_db.group_id} not found for settlement {settlement_id} during delete check.")
+        except (GroupMembershipError, GroupPermissionError, GroupNotFoundError):
+            pass

    if not can_delete:
@@ -656,3 +671,20 @@ async def delete_settlement_record(
        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="An unexpected error occurred.")

    return Response(status_code=status.HTTP_204_NO_CONTENT)
+
+@router.get("/users/me/financial-activity", response_model=FinancialActivityResponse, summary="Get User's Financial Activity", tags=["Users", "Expenses", "Settlements"])
+async def get_user_financial_activity(
+    db: AsyncSession = Depends(get_transactional_session),
+    current_user: UserModel = Depends(current_active_user),
+):
+    """
+    Retrieves a consolidated and chronologically sorted list of all financial activities
+    for the current user, including expenses they are part of and settlements they have
+    made or received.
+    """
+    logger.info(f"User {current_user.email} requesting their financial activity feed.")
+    activities = await financials_service.get_user_financial_activity(db=db, user_id=current_user.id)
+
+    # The service returns a mix of ExpenseModel and SettlementModel objects.
+    # We need to wrap it in our response schema. Pydantic will handle the Union type.
+    return FinancialActivityResponse(activities=activities)
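The new feed endpoint above returns expenses and settlements interleaved in chronological order. A minimal sketch of that merge step (hypothetical item shape; the real service would sort model objects on their timestamp column):

```python
from datetime import datetime

def merge_activity(expenses: list[tuple[datetime, str]],
                   settlements: list[tuple[datetime, str]]) -> list[tuple[datetime, str]]:
    """Combine two activity lists into one feed, newest first."""
    return sorted(expenses + settlements, key=lambda item: item[0], reverse=True)
```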
@@ -1,27 +1,31 @@
# app/api/v1/endpoints/groups.py
import logging
from typing import List

-from fastapi import APIRouter, Depends, HTTPException, status
+from fastapi import APIRouter, Depends, HTTPException, status, Query
from sqlalchemy.ext.asyncio import AsyncSession

from app.database import get_transactional_session, get_session
from app.auth import current_active_user
-from app.models import User as UserModel, UserRoleEnum # Import model and enum
-from app.schemas.group import GroupCreate, GroupPublic
+from app.models import User as UserModel, UserRoleEnum
+from app.schemas.group import GroupCreate, GroupPublic, GroupScheduleGenerateRequest, GroupDelete
from app.schemas.invite import InviteCodePublic
-from app.schemas.message import Message # For simple responses
-from app.schemas.list import ListPublic, ListDetail
+from app.schemas.message import Message
+from app.schemas.list import ListDetail
+from app.schemas.chore import ChoreHistoryPublic, ChoreAssignmentPublic
+from app.schemas.user import UserPublic
from app.crud import group as crud_group
from app.crud import invite as crud_invite
from app.crud import list as crud_list
+from app.crud import history as crud_history
+from app.crud import schedule as crud_schedule
from app.core.exceptions import (
    GroupNotFoundError,
    GroupPermissionError,
    GroupMembershipError,
    GroupOperationError,
    GroupValidationError,
-    InviteCreationError
+    InviteCreationError,
+    InvalidOperationError
)

logger = logging.getLogger(__name__)
@@ -42,8 +46,6 @@ async def create_group(
    """Creates a new group, adding the creator as the owner."""
    logger.info(f"User {current_user.email} creating group: {group_in.name}")
    created_group = await crud_group.create_group(db=db, group_in=group_in, creator_id=current_user.id)
-    # Load members explicitly if needed for the response (optional here)
-    # created_group = await crud_group.get_group_by_id(db, created_group.id)
    return created_group


@@ -54,7 +56,7 @@ async def create_group(
    tags=["Groups"]
)
async def read_user_groups(
-    db: AsyncSession = Depends(get_session), # Use read-only session for GET
+    db: AsyncSession = Depends(get_session),
    current_user: UserModel = Depends(current_active_user),
):
    """Retrieves all groups the current user is a member of."""
@@ -71,12 +73,11 @@ async def read_user_groups(
)
async def read_group(
    group_id: int,
-    db: AsyncSession = Depends(get_session), # Use read-only session for GET
+    db: AsyncSession = Depends(get_session),
    current_user: UserModel = Depends(current_active_user),
):
    """Retrieves details for a specific group, including members, if the user is part of it."""
    logger.info(f"User {current_user.email} requesting details for group ID: {group_id}")
-    # Check if user is a member first
    is_member = await crud_group.is_user_member(db=db, group_id=group_id, user_id=current_user.id)
    if not is_member:
        logger.warning(f"Access denied: User {current_user.email} not member of group {group_id}")
@@ -89,6 +90,31 @@ async def read_group(

    return group

+@router.get(
+    "/{group_id}/members",
+    response_model=List[UserPublic],
+    summary="Get Group Members",
+    tags=["Groups"]
+)
+async def read_group_members(
+    group_id: int,
+    db: AsyncSession = Depends(get_session),
+    current_user: UserModel = Depends(current_active_user),
+):
+    """Retrieves all members of a specific group, if the user is part of it."""
+    logger.info(f"User {current_user.email} requesting members for group ID: {group_id}")
+
+    is_member = await crud_group.is_user_member(db=db, group_id=group_id, user_id=current_user.id)
+    if not is_member:
+        logger.warning(f"Access denied: User {current_user.email} not member of group {group_id}")
+        raise GroupMembershipError(group_id, "view group members")
+
+    group = await crud_group.get_group_by_id(db=db, group_id=group_id)
+    if not group:
+        logger.error(f"Group {group_id} requested by member {current_user.email} not found (data inconsistency?)")
+        raise GroupNotFoundError(group_id)
+
+    return [member_assoc.user for member_assoc in group.member_associations]

@router.post(
    "/{group_id}/invites",
@@ -105,12 +131,10 @@ async def create_group_invite(
    logger.info(f"User {current_user.email} attempting to create invite for group {group_id}")
    user_role = await crud_group.get_user_role_in_group(db, group_id=group_id, user_id=current_user.id)

-    # --- Permission Check (MVP: Owner only) ---
    if user_role != UserRoleEnum.owner:
        logger.warning(f"Permission denied: User {current_user.email} (role: {user_role}) cannot create invite for group {group_id}")
        raise GroupPermissionError(group_id, "create invites")

-    # Check if group exists (implicitly done by role check, but good practice)
    group = await crud_group.get_group_by_id(db, group_id)
    if not group:
        raise GroupNotFoundError(group_id)
@@ -118,7 +142,6 @@ async def create_group_invite(
    invite = await crud_invite.create_invite(db=db, group_id=group_id, creator_id=current_user.id)
    if not invite:
        logger.error(f"Failed to generate unique invite code for group {group_id}")
-        # This case should ideally be covered by exceptions from create_invite now
        raise InviteCreationError(group_id)

    logger.info(f"User {current_user.email} created invite code for group {group_id}")
@@ -132,26 +155,20 @@ async def create_group_invite(
)
async def get_group_active_invite(
    group_id: int,
-    db: AsyncSession = Depends(get_session), # Use read-only session for GET
+    db: AsyncSession = Depends(get_session),
    current_user: UserModel = Depends(current_active_user),
):
    """Retrieves the active invite code for the group. Requires group membership (owner/admin to be stricter later if needed)."""
    logger.info(f"User {current_user.email} attempting to get active invite for group {group_id}")

-    # Permission check: Ensure user is a member of the group to view invite code
-    # Using get_user_role_in_group which also checks membership indirectly
    user_role = await crud_group.get_user_role_in_group(db, group_id=group_id, user_id=current_user.id)
    if user_role is None:  # Not a member
        logger.warning(f"Permission denied: User {current_user.email} is not a member of group {group_id} and cannot view invite code.")
-        # More specific error or let GroupPermissionError handle if we want to be generic
        raise GroupMembershipError(group_id, "view invite code for this group (not a member)")

-    # Fetch the active invite for the group
    invite = await crud_invite.get_active_invite_for_group(db, group_id=group_id)

    if not invite:
-        # This case means no active (non-expired, active=true) invite exists.
-        # The frontend can then prompt to generate one.
        logger.info(f"No active invite code found for group {group_id} when requested by {current_user.email}")
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
@@ -159,7 +176,50 @@ async def get_group_active_invite(
    )

    logger.info(f"User {current_user.email} retrieved active invite code for group {group_id}")
-    return invite # Pydantic will convert InviteModel to InviteCodePublic
+    return invite

+@router.delete(
+    "/{group_id}",
+    response_model=Message,
+    summary="Delete Group (Owner Only)",
+    tags=["Groups"]
+)
+async def delete_group(
+    group_id: int,
+    delete_confirmation: GroupDelete,
+    expected_version: int | None = Query(None, description="Current version for optimistic locking"),
+    db: AsyncSession = Depends(get_transactional_session),
+    current_user: UserModel = Depends(current_active_user),
+):
+    """Permanently deletes a group and all associated data. Requires owner role and explicit confirmation."""
+    logger.info(f"Owner {current_user.email} attempting to delete group {group_id}")
+
+    # Check if user is owner
+    user_role = await crud_group.get_user_role_in_group(db, group_id=group_id, user_id=current_user.id)
+    if user_role != UserRoleEnum.owner:
+        logger.warning(f"Permission denied: User {current_user.email} (role: {user_role}) cannot delete group {group_id}")
+        raise GroupPermissionError(group_id, "delete group")
+
+    # Get group to verify name
+    group = await crud_group.get_group_by_id(db, group_id)
+    if not group:
+        raise GroupNotFoundError(group_id)
+
+    # Verify confirmation name matches group name
+    if delete_confirmation.confirmation_name != group.name:
+        raise GroupValidationError(
+            f"Confirmation name '{delete_confirmation.confirmation_name}' does not match group name '{group.name}'"
+        )
+
+    # Delete the group
+    try:
+        await crud_group.delete_group(db=db, group_id=group_id, expected_version=expected_version)
+    except InvalidOperationError as e:
+        status_code = status.HTTP_409_CONFLICT if "version" in str(e).lower() else status.HTTP_400_BAD_REQUEST
+        raise HTTPException(status_code=status_code, detail=str(e))
+
+    logger.info(f"Group {group_id} successfully deleted by owner {current_user.email}")
+    return Message(detail="Group successfully deleted")
@router.delete(
    "/{group_id}/leave",
@@ -172,34 +232,31 @@ async def leave_group(
    db: AsyncSession = Depends(get_transactional_session),
    current_user: UserModel = Depends(current_active_user),
):
-    """Removes the current user from the specified group. If the owner is the last member, the group will be deleted."""
+    """Removes the current user from the specified group. If the user is the owner and last member, the group will be archived."""
    logger.info(f"User {current_user.email} attempting to leave group {group_id}")
    user_role = await crud_group.get_user_role_in_group(db, group_id=group_id, user_id=current_user.id)

    if user_role is None:
        raise GroupMembershipError(group_id, "leave (you are not a member)")

-    # Check if owner is the last member
    if user_role == UserRoleEnum.owner:
        member_count = await crud_group.get_group_member_count(db, group_id)
        if member_count <= 1:
-            # Delete the group since owner is the last member
-            logger.info(f"Owner {current_user.email} is the last member. Deleting group {group_id}")
-            await crud_group.delete_group(db, group_id)
-            return Message(detail="Group deleted as you were the last member")
+            logger.info(f"Owner {current_user.email} is the last member. Group {group_id} will be archived.")
+            # TODO: Implement group archiving logic here
+            # For now, we'll just remove the user but keep the group
+            await crud_group.remove_user_from_group(db, group_id=group_id, user_id=current_user.id)
+            return Message(detail="You have left the group. As you were the last member, the group has been archived.")

-    # Proceed with removal for non-owner or if there are other members
    deleted = await crud_group.remove_user_from_group(db, group_id=group_id, user_id=current_user.id)

    if not deleted:
-        # Should not happen if role check passed, but handle defensively
        logger.error(f"Failed to remove user {current_user.email} from group {group_id} despite being a member.")
        raise GroupOperationError("Failed to leave group")

    logger.info(f"User {current_user.email} successfully left group {group_id}")
    return Message(detail="Successfully left the group")

-# --- Optional: Remove Member Endpoint ---
@router.delete(
    "/{group_id}/members/{user_id_to_remove}",
    response_model=Message,
@@ -216,21 +273,17 @@ async def remove_group_member(
    logger.info(f"Owner {current_user.email} attempting to remove user {user_id_to_remove} from group {group_id}")
    owner_role = await crud_group.get_user_role_in_group(db, group_id=group_id, user_id=current_user.id)

-    # --- Permission Check ---
    if owner_role != UserRoleEnum.owner:
        logger.warning(f"Permission denied: User {current_user.email} (role: {owner_role}) cannot remove members from group {group_id}")
        raise GroupPermissionError(group_id, "remove members")

-    # Prevent owner removing themselves via this endpoint
    if current_user.id == user_id_to_remove:
        raise GroupValidationError("Owner cannot remove themselves using this endpoint. Use 'Leave Group' instead.")

-    # Check if target user is actually in the group
    target_role = await crud_group.get_user_role_in_group(db, group_id=group_id, user_id=user_id_to_remove)
    if target_role is None:
        raise GroupMembershipError(group_id, "remove this user (they are not a member)")

-    # Proceed with removal
    deleted = await crud_group.remove_user_from_group(db, group_id=group_id, user_id=user_id_to_remove)

    if not deleted:
@@ -248,20 +301,67 @@ async def remove_group_member(
)
async def read_group_lists(
    group_id: int,
-    db: AsyncSession = Depends(get_session), # Use read-only session for GET
+    db: AsyncSession = Depends(get_session),
    current_user: UserModel = Depends(current_active_user),
):
    """Retrieves all lists belonging to a specific group, if the user is a member."""
    logger.info(f"User {current_user.email} requesting lists for group ID: {group_id}")

    # Check if user is a member first
    is_member = await crud_group.is_user_member(db=db, group_id=group_id, user_id=current_user.id)
    if not is_member:
        logger.warning(f"Access denied: User {current_user.email} not member of group {group_id}")
        raise GroupMembershipError(group_id, "view group lists")

    # Get all lists for the user and filter by group_id
    lists = await crud_list.get_lists_for_user(db=db, user_id=current_user.id)
    group_lists = [list for list in lists if list.group_id == group_id]

    return group_lists

+@router.post(
+    "/{group_id}/chores/generate-schedule",
+    response_model=List[ChoreAssignmentPublic],
+    summary="Generate Group Chore Schedule",
+    tags=["Groups", "Chores"]
+)
+async def generate_group_chore_schedule(
+    group_id: int,
+    schedule_in: GroupScheduleGenerateRequest,
+    db: AsyncSession = Depends(get_transactional_session),
+    current_user: UserModel = Depends(current_active_user),
+):
+    """Generates a round-robin chore schedule for a group."""
+    logger.info(f"User {current_user.email} generating chore schedule for group {group_id}")
+    if not await crud_group.is_user_member(db, group_id, current_user.id):
+        raise GroupMembershipError(group_id, "generate chore schedule for this group")
+
+    try:
+        assignments = await crud_schedule.generate_group_chore_schedule(
+            db=db,
+            group_id=group_id,
+            start_date=schedule_in.start_date,
+            end_date=schedule_in.end_date,
+            user_id=current_user.id,
+            member_ids=schedule_in.member_ids,
+        )
+        return assignments
+    except Exception as e:
+        logger.error(f"Error generating schedule for group {group_id}: {e}", exc_info=True)
+        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail=str(e))
+
+@router.get(
+    "/{group_id}/chores/history",
+    response_model=List[ChoreHistoryPublic],
+    summary="Get Group Chore History",
+    tags=["Groups", "Chores", "History"]
+)
+async def get_group_chore_history(
+    group_id: int,
+    db: AsyncSession = Depends(get_session),
+    current_user: UserModel = Depends(current_active_user),
+):
+    """Retrieves all chore-related history for a specific group."""
+    logger.info(f"User {current_user.email} requesting chore history for group {group_id}")
+    if not await crud_group.is_user_member(db, group_id, current_user.id):
+        raise GroupMembershipError(group_id, "view chore history for this group")
+
+    return await crud_history.get_group_chore_history(db=db, group_id=group_id)
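The `generate_group_chore_schedule` endpoint above produces a round-robin schedule between `start_date` and `end_date`. A minimal sketch of the rotation itself (hypothetical daily cadence; the real CRUD layer also persists the assignments):

```python
from datetime import date, timedelta

def round_robin_schedule(member_ids: list[int], start: date, end: date) -> list[tuple[date, int]]:
    """Assign one member per day, cycling through the member list in order."""
    assignments = []
    day, i = start, 0
    while day <= end:
        # Modulo wraps the index back to the first member after the last one.
        assignments.append((day, member_ids[i % len(member_ids)]))
        day += timedelta(days=1)
        i += 1
    return assignments
```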
@@ -1,4 +1,3 @@
# app/api/v1/endpoints/health.py
import logging
from fastapi import APIRouter, Depends
from sqlalchemy.ext.asyncio import AsyncSession
@@ -7,7 +6,6 @@ from sqlalchemy.sql import text
from app.database import get_transactional_session
from app.schemas.health import HealthStatus
from app.core.exceptions import DatabaseConnectionError

logger = logging.getLogger(__name__)
router = APIRouter()

@@ -22,17 +20,9 @@ async def check_health(db: AsyncSession = Depends(get_transactional_session)):
    """
    Health check endpoint. Verifies API reachability and database connection.
    """
    try:
        # Try executing a simple query to check DB connection
        result = await db.execute(text("SELECT 1"))
        if result.scalar_one() == 1:
            logger.info("Health check successful: Database connection verified.")
            return HealthStatus(status="ok", database="connected")
        else:
            # This case should ideally not happen with 'SELECT 1'
            logger.error("Health check failed: Database connection check returned unexpected result.")
            raise DatabaseConnectionError("Unexpected result from database connection check")

    except Exception as e:
        logger.error(f"Health check failed: Database connection error - {e}", exc_info=True)
        raise DatabaseConnectionError(str(e))
46
be/app/api/v1/endpoints/history.py
Normal file
@@ -0,0 +1,46 @@
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.ext.asyncio import AsyncSession
from typing import List

from app import models
from app.schemas.audit import FinancialAuditLogPublic
from app.database import get_session
from app.auth import current_active_user
from app.crud import audit as crud_audit, group as crud_group

router = APIRouter()

@router.get("/financial/group/{group_id}", response_model=List[FinancialAuditLogPublic])
async def read_financial_history_for_group(
    group_id: int,
    db: AsyncSession = Depends(get_session),
    current_user: models.User = Depends(current_active_user),
    skip: int = 0,
    limit: int = 100,
):
    """
    Retrieve financial audit history for a specific group.
    """
    is_member = await crud_group.is_user_member(db, group_id=group_id, user_id=current_user.id)
    if not is_member:
        raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Not a member of this group")

    history = await crud_audit.get_financial_audit_logs_for_group(
        db=db, group_id=group_id, skip=skip, limit=limit
    )
    return history

@router.get("/financial/user/me", response_model=List[FinancialAuditLogPublic])
async def read_financial_history_for_user(
    db: AsyncSession = Depends(get_session),
    current_user: models.User = Depends(current_active_user),
    skip: int = 0,
    limit: int = 100,
):
    """
    Retrieve financial audit history for the current user.
    """
    history = await crud_audit.get_financial_audit_logs_for_user(
        db=db, user_id=current_user.id, skip=skip, limit=limit
    )
    return history
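Both history endpoints expose `skip`/`limit` query parameters. A minimal sketch of the offset-based pagination contract they imply (the `paginate` helper is illustrative only; in the real code the slicing happens inside the CRUD layer's SQL query):

```python
def paginate(rows: list, skip: int = 0, limit: int = 100) -> list:
    """Return at most `limit` rows, starting after the first `skip` rows."""
    if skip < 0 or limit < 0:
        raise ValueError("skip and limit must be non-negative")
    return rows[skip:skip + limit]

# Two consecutive pages of a 5-row result set, 2 rows per page:
first_page = paginate([10, 20, 30, 40, 50], skip=0, limit=2)   # [10, 20]
second_page = paginate([10, 20, 30, 40, 50], skip=2, limit=2)  # [30, 40]
```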
@@ -1,21 +1,16 @@
# app/api/v1/endpoints/invites.py
import logging
-from fastapi import APIRouter, Depends, HTTPException, status
+from fastapi import APIRouter, Depends
from sqlalchemy.ext.asyncio import AsyncSession

from app.database import get_transactional_session
from app.auth import current_active_user
-from app.models import User as UserModel, UserRoleEnum
+from app.models import User as UserModel
from app.schemas.invite import InviteAccept
from app.schemas.message import Message
from app.schemas.group import GroupPublic
from app.crud import invite as crud_invite
from app.crud import group as crud_group
from app.core.exceptions import (
    InviteNotFoundError,
    InviteExpiredError,
    InviteAlreadyUsedError,
    InviteCreationError,
    GroupNotFoundError,
    GroupMembershipError,
    GroupOperationError
@@ -25,7 +20,7 @@ logger = logging.getLogger(__name__)
router = APIRouter()

@router.post(
-    "/accept", # Route relative to prefix "/invites"
+    "/accept",
    response_model=GroupPublic,
    summary="Accept Group Invite",
    tags=["Invites"]
@@ -38,41 +33,32 @@ async def accept_invite(
    """Accepts a group invite using the provided invite code."""
    logger.info(f"User {current_user.email} attempting to accept invite code: {invite_in.code}")

-    # Get the invite - this function should only return valid, active invites
    invite = await crud_invite.get_active_invite_by_code(db, code=invite_in.code)
    if not invite:
        logger.warning(f"Invalid or inactive invite code attempted by user {current_user.email}: {invite_in.code}")
-        # We can use a more generic error or a specific one. InviteNotFound is reasonable.
        raise InviteNotFoundError(invite_in.code)

-    # Check if group still exists
    group = await crud_group.get_group_by_id(db, group_id=invite.group_id)
    if not group:
        logger.error(f"Group {invite.group_id} not found for invite {invite_in.code}")
        raise GroupNotFoundError(invite.group_id)

-    # Check if user is already a member
    is_member = await crud_group.is_user_member(db, group_id=invite.group_id, user_id=current_user.id)
    if is_member:
        logger.warning(f"User {current_user.email} already a member of group {invite.group_id}")
        raise GroupMembershipError(invite.group_id, "join (already a member)")

-    # Add user to the group
    added_to_group = await crud_group.add_user_to_group(db, group_id=invite.group_id, user_id=current_user.id)
    if not added_to_group:
        logger.error(f"Failed to add user {current_user.email} to group {invite.group_id} during invite acceptance.")
-        # This could be a race condition or other issue, treat as an operational error.
        raise GroupOperationError("Failed to add user to group.")

-    # Deactivate the invite so it cannot be used again
    await crud_invite.deactivate_invite(db, invite=invite)

    logger.info(f"User {current_user.email} successfully joined group {invite.group_id} via invite {invite_in.code}")

-    # Re-fetch the group to get the updated member list
    updated_group = await crud_group.get_group_by_id(db, group_id=invite.group_id)
    if not updated_group:
-        # This should ideally not happen as we found it before
        logger.error(f"Could not re-fetch group {invite.group_id} after user {current_user.email} joined.")
        raise GroupNotFoundError(invite.group_id)
@@ -1,4 +1,4 @@
# app/api/v1/endpoints/items.py

import logging
from typing import List as PyList, Optional

@@ -6,21 +6,17 @@ from fastapi import APIRouter, Depends, HTTPException, status, Response, Query
from sqlalchemy.ext.asyncio import AsyncSession

from app.database import get_transactional_session
from app.auth import current_active_user
-# --- Import Models Correctly ---
from app.models import User as UserModel
-from app.models import Item as ItemModel # <-- IMPORT Item and alias it
-# --- End Import Models ---
+from app.models import Item as ItemModel
from app.schemas.item import ItemCreate, ItemUpdate, ItemPublic
from app.crud import item as crud_item
from app.crud import list as crud_list
from app.core.exceptions import ItemNotFoundError, ListPermissionError, ConflictError
-from app.auth import current_active_user

logger = logging.getLogger(__name__)
router = APIRouter()

# --- Helper Dependency for Item Permissions ---
-# Now ItemModel is defined before being used as a type hint
async def get_item_and_verify_access(
    item_id: int,
    db: AsyncSession = Depends(get_transactional_session),
@@ -31,19 +27,15 @@ async def get_item_and_verify_access(
    if not item_db:
        raise ItemNotFoundError(item_id)

-    # Check permission on the parent list
    try:
        await crud_list.check_list_permission(db=db, list_id=item_db.list_id, user_id=current_user.id)
    except ListPermissionError as e:
-        # Re-raise with a more specific message
        raise ListPermissionError(item_db.list_id, "access this item's list")
    return item_db


# --- Endpoints ---

@router.post(
-    "/lists/{list_id}/items", # Nested under lists
+    "/lists/{list_id}/items",
    response_model=ItemPublic,
    status_code=status.HTTP_201_CREATED,
    summary="Add Item to List",
@@ -56,13 +48,11 @@ async def create_list_item(
    current_user: UserModel = Depends(current_active_user),
):
    """Adds a new item to a specific list. User must have access to the list."""
-    user_email = current_user.email # Access email attribute before async operations
+    user_email = current_user.email
    logger.info(f"User {user_email} adding item to list {list_id}: {item_in.name}")
-    # Verify user has access to the target list
    try:
        await crud_list.check_list_permission(db=db, list_id=list_id, user_id=current_user.id)
    except ListPermissionError as e:
-        # Re-raise with a more specific message
        raise ListPermissionError(list_id, "add items to this list")

    created_item = await crud_item.create_item(
@@ -73,7 +63,7 @@ async def create_list_item(


@router.get(
-    "/lists/{list_id}/items", # Nested under lists
+    "/lists/{list_id}/items",
    response_model=PyList[ItemPublic],
    summary="List Items in List",
    tags=["Items"]
@@ -82,16 +72,13 @@ async def read_list_items(
    list_id: int,
    db: AsyncSession = Depends(get_transactional_session),
    current_user: UserModel = Depends(current_active_user),
-    # Add sorting/filtering params later if needed: sort_by: str = 'created_at', order: str = 'asc'
):
    """Retrieves all items for a specific list if the user has access."""
-    user_email = current_user.email # Access email attribute before async operations
+    user_email = current_user.email
    logger.info(f"User {user_email} listing items for list {list_id}")
-    # Verify user has access to the list
    try:
        await crud_list.check_list_permission(db=db, list_id=list_id, user_id=current_user.id)
    except ListPermissionError as e:
-        # Re-raise with a more specific message
        raise ListPermissionError(list_id, "view items in this list")

    items = await crud_item.get_items_by_list_id(db=db, list_id=list_id)
@@ -99,7 +86,7 @@ async def read_list_items(


@router.put(
-    "/lists/{list_id}/items/{item_id}", # Nested under lists
+    "/lists/{list_id}/items/{item_id}",
    response_model=ItemPublic,
    summary="Update Item",
    tags=["Items"],
@@ -111,9 +98,9 @@ async def update_item(
    list_id: int,
    item_id: int,
    item_in: ItemUpdate,
-    item_db: ItemModel = Depends(get_item_and_verify_access), # Use dependency to get item and check list access
+    item_db: ItemModel = Depends(get_item_and_verify_access),
    db: AsyncSession = Depends(get_transactional_session),
-    current_user: UserModel = Depends(current_active_user), # Need user ID for completed_by
+    current_user: UserModel = Depends(current_active_user),
):
    """
    Updates an item's details (name, quantity, is_complete, price).
@@ -122,9 +109,8 @@ async def update_item(
    If the version does not match, a 409 Conflict is returned.
    Sets/unsets `completed_by_id` based on `is_complete` flag.
    """
-    user_email = current_user.email # Access email attribute before async operations
+    user_email = current_user.email
    logger.info(f"User {user_email} attempting to update item ID: {item_id} with version {item_in.version}")
-    # Permission check is handled by get_item_and_verify_access dependency

    try:
        updated_item = await crud_item.update_item(
@@ -141,7 +127,7 @@ async def update_item(


@router.delete(
-    "/lists/{list_id}/items/{item_id}", # Nested under lists
+    "/lists/{list_id}/items/{item_id}",
    status_code=status.HTTP_204_NO_CONTENT,
    summary="Delete Item",
    tags=["Items"],
@@ -153,18 +139,16 @@ async def delete_item(
    list_id: int,
    item_id: int,
    expected_version: Optional[int] = Query(None, description="The expected version of the item to delete for optimistic locking."),
-    item_db: ItemModel = Depends(get_item_and_verify_access), # Use dependency to get item and check list access
+    item_db: ItemModel = Depends(get_item_and_verify_access),
    db: AsyncSession = Depends(get_transactional_session),
-    current_user: UserModel = Depends(current_active_user), # Log who deleted it
+    current_user: UserModel = Depends(current_active_user),
):
    """
    Deletes an item. User must have access to the list the item belongs to.
    If `expected_version` is provided and does not match the item's current version,
    a 409 Conflict is returned.
    """
-    user_email = current_user.email # Access email attribute before async operations
-    logger.info(f"User {user_email} attempting to delete item ID: {item_id}, expected version: {expected_version}")
-    # Permission check is handled by get_item_and_verify_access dependency
+    user_email = current_user.email

    if expected_version is not None and item_db.version != expected_version:
        logger.warning(
@@ -1,34 +1,27 @@
# app/api/v1/endpoints/lists.py
import logging
-from typing import List as PyList, Optional # Alias for Python List type hint
-
-from fastapi import APIRouter, Depends, HTTPException, status, Response, Query # Added Query
+from typing import List as PyList, Optional
+from fastapi import APIRouter, Depends, HTTPException, status, Response, Query
from sqlalchemy.ext.asyncio import AsyncSession

from app.database import get_transactional_session
from app.auth import current_active_user
from app.models import User as UserModel
from app.schemas.list import ListCreate, ListUpdate, ListPublic, ListDetail
from app.schemas.message import Message # For simple responses
from app.crud import list as crud_list
-from app.crud import group as crud_group # Need for group membership check
+from app.crud import group as crud_group
from app.schemas.list import ListStatus, ListStatusWithId
-from app.schemas.expense import ExpensePublic # Import ExpensePublic
+from app.schemas.expense import ExpensePublic
from app.core.exceptions import (
    GroupMembershipError,
    ListNotFoundError,
    ListPermissionError,
    ListStatusNotFoundError,
-    ConflictError, # Added ConflictError
-    DatabaseIntegrityError # Added DatabaseIntegrityError
+    ConflictError,
+    DatabaseIntegrityError
)

logger = logging.getLogger(__name__)
router = APIRouter()

@router.post(
-    "", # Route relative to prefix "/lists"
-    response_model=ListPublic, # Return basic list info on creation
+    "",
+    response_model=ListPublic,
    status_code=status.HTTP_201_CREATED,
    summary="Create New List",
    tags=["Lists"],
@@ -53,7 +46,6 @@ async def create_list(
    logger.info(f"User {current_user.email} creating list: {list_in.name}")
    group_id = list_in.group_id

-    # Permission Check: If sharing with a group, verify membership
    if group_id:
        is_member = await crud_group.is_user_member(db, group_id=group_id, user_id=current_user.id)
        if not is_member:
@@ -65,9 +57,7 @@ async def create_list(
        logger.info(f"List '{created_list.name}' (ID: {created_list.id}) created successfully for user {current_user.email}.")
        return created_list
    except DatabaseIntegrityError as e:
-        # Check if this is a unique constraint violation
        if "unique constraint" in str(e).lower():
-            # Find the existing list with the same name in the group
            existing_list = await crud_list.get_list_by_name_and_group(
                db=db,
                name=list_in.name,
@@ -81,20 +71,18 @@ async def create_list(
                detail=f"A list named '{list_in.name}' already exists in this group.",
                headers={"X-Existing-List": str(existing_list.id)}
            )
-        # If it's not a unique constraint or we couldn't find the existing list, re-raise
        raise


@router.get(
-    "", # Route relative to prefix "/lists"
-    response_model=PyList[ListDetail], # Return a list of detailed list info including items
+    "",
+    response_model=PyList[ListDetail],
    summary="List Accessible Lists",
    tags=["Lists"]
)
async def read_lists(
    db: AsyncSession = Depends(get_transactional_session),
    current_user: UserModel = Depends(current_active_user),
-    # Add pagination parameters later if needed: skip: int = 0, limit: int = 100
):
    """
    Retrieves lists accessible to the current user:
@@ -106,6 +94,24 @@ async def read_lists(
    return lists


+@router.get(
+    "/archived",
+    response_model=PyList[ListDetail],
+    summary="List Archived Lists",
+    tags=["Lists"]
+)
+async def read_archived_lists(
+    db: AsyncSession = Depends(get_transactional_session),
+    current_user: UserModel = Depends(current_active_user),
+):
+    """
+    Retrieves archived lists for the current user.
+    """
+    logger.info(f"Fetching archived lists for user: {current_user.email}")
+    lists = await crud_list.get_lists_for_user(db=db, user_id=current_user.id, include_archived=True)
+    return [l for l in lists if l.archived_at]


@router.get(
    "/statuses",
    response_model=PyList[ListStatusWithId],
@@ -128,7 +134,6 @@ async def read_lists_statuses(

    statuses = await crud_list.get_lists_statuses_by_ids(db=db, list_ids=ids, user_id=current_user.id)

-    # The CRUD function returns a list of Row objects, so we map them to the Pydantic model
    return [
        ListStatusWithId(
            id=s.id,
@@ -141,7 +146,7 @@ async def read_lists_statuses(

@router.get(
    "/{list_id}",
-    response_model=ListDetail, # Return detailed list info including items
+    response_model=ListDetail,
    summary="Get List Details",
    tags=["Lists"]
)
@@ -155,17 +160,16 @@ async def read_list(
    if the user has permission (creator or group member).
    """
    logger.info(f"User {current_user.email} requesting details for list ID: {list_id}")
-    # The check_list_permission function will raise appropriate exceptions
    list_db = await crud_list.check_list_permission(db=db, list_id=list_id, user_id=current_user.id)
    return list_db


@router.put(
    "/{list_id}",
-    response_model=ListPublic, # Return updated basic info
+    response_model=ListPublic,
    summary="Update List",
    tags=["Lists"],
-    responses={ # Add 409 to responses
+    responses={
        status.HTTP_409_CONFLICT: {"description": "Conflict: List has been modified by someone else"}
    }
)
@@ -188,43 +192,40 @@ async def update_list(
        updated_list = await crud_list.update_list(db=db, list_db=list_db, list_in=list_in)
        logger.info(f"List {list_id} updated successfully by user {current_user.email} to version {updated_list.version}.")
        return updated_list
-    except ConflictError as e: # Catch and re-raise as HTTPException for proper FastAPI response
+    except ConflictError as e:
        logger.warning(f"Conflict updating list {list_id} for user {current_user.email}: {str(e)}")
        raise HTTPException(status_code=status.HTTP_409_CONFLICT, detail=str(e))
-    except Exception as e: # Catch other potential errors from crud operation
+    except Exception as e:
        logger.error(f"Error updating list {list_id} for user {current_user.email}: {str(e)}")
-        # Consider a more generic error, but for now, let's keep it specific if possible
-        # Re-raising might be better if crud layer already raises appropriate HTTPExceptions
        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="An unexpected error occurred while updating the list.")


@router.delete(
    "/{list_id}",
-    status_code=status.HTTP_204_NO_CONTENT, # Standard for successful DELETE with no body
-    summary="Delete List",
+    status_code=status.HTTP_204_NO_CONTENT,
+    summary="Archive List",
    tags=["Lists"],
-    responses={ # Add 409 to responses
-        status.HTTP_409_CONFLICT: {"description": "Conflict: List has been modified, cannot delete specified version"}
+    responses={
+        status.HTTP_409_CONFLICT: {"description": "Conflict: List has been modified, cannot archive specified version"}
    }
)
-async def delete_list(
+async def archive_list_endpoint(
    list_id: int,
-    expected_version: Optional[int] = Query(None, description="The expected version of the list to delete for optimistic locking."),
+    expected_version: Optional[int] = Query(None, description="The expected version of the list to archive for optimistic locking."),
    db: AsyncSession = Depends(get_transactional_session),
    current_user: UserModel = Depends(current_active_user),
):
    """
-    Deletes a list. Requires user to be the creator of the list.
+    Archives a list. Requires user to be the creator of the list.
    If `expected_version` is provided and does not match the list's current version,
    a 409 Conflict is returned.
    """
-    logger.info(f"User {current_user.email} attempting to delete list ID: {list_id}, expected version: {expected_version}")
-    # Use the helper, requiring creator permission
+    logger.info(f"User {current_user.email} attempting to archive list ID: {list_id}, expected version: {expected_version}")
    list_db = await crud_list.check_list_permission(db=db, list_id=list_id, user_id=current_user.id, require_creator=True)

    if expected_version is not None and list_db.version != expected_version:
        logger.warning(
-            f"Conflict deleting list {list_id} for user {current_user.email}. "
+            f"Conflict archiving list {list_id} for user {current_user.email}. "
            f"Expected version {expected_version}, actual version {list_db.version}."
        )
        raise HTTPException(
@@ -232,11 +233,37 @@ async def delete_list(
            detail=f"List has been modified. Expected version {expected_version}, but current version is {list_db.version}. Please refresh."
        )

-    await crud_list.delete_list(db=db, list_db=list_db)
-    logger.info(f"List {list_id} (version: {list_db.version}) deleted successfully by user {current_user.email}.")
+    await crud_list.archive_list(db=db, list_db=list_db)
+    logger.info(f"List {list_id} (version: {list_db.version}) archived successfully by user {current_user.email}.")
    return Response(status_code=status.HTTP_204_NO_CONTENT)


+@router.post(
+    "/{list_id}/unarchive",
+    response_model=ListPublic,
+    summary="Unarchive List",
+    tags=["Lists"]
+)
+async def unarchive_list_endpoint(
+    list_id: int,
+    db: AsyncSession = Depends(get_transactional_session),
+    current_user: UserModel = Depends(current_active_user),
+):
+    """
+    Restores an archived list.
+    """
+    logger.info(f"User {current_user.email} attempting to unarchive list ID: {list_id}")
+    list_db = await crud_list.check_list_permission(db=db, list_id=list_id, user_id=current_user.id, require_creator=True)
+
+    if not list_db.archived_at:
+        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail="List is not archived.")
+
+    updated_list = await crud_list.unarchive_list(db=db, list_db=list_db)
+
+    logger.info(f"List {list_id} unarchived successfully by user {current_user.email}.")
+    return updated_list


@router.get(
    "/{list_id}/status",
    response_model=ListStatus,
@@ -253,7 +280,6 @@ async def read_list_status(
    if the user has permission (creator or group member).
    """
    logger.info(f"User {current_user.email} requesting status for list ID: {list_id}")
-    # The check_list_permission is not needed here as get_list_status handles not found
    await crud_list.check_list_permission(db=db, list_id=list_id, user_id=current_user.id)
    return await crud_list.get_list_status(db=db, list_id=list_id)

@@ -278,9 +304,7 @@ async def read_list_expenses(

    logger.info(f"User {current_user.email} requesting expenses for list ID: {list_id}")

-    # Check if user has permission to access this list
    await crud_list.check_list_permission(db=db, list_id=list_id, user_id=current_user.id)

-    # Get expenses for this list
    expenses = await crud_expense.get_expenses_for_list(db, list_id=list_id, skip=skip, limit=limit)
    return expenses
@@ -1,9 +1,5 @@
import logging
-from typing import List
-
-from fastapi import APIRouter, Depends, UploadFile, File, HTTPException, status
-from google.api_core import exceptions as google_exceptions
-
+from fastapi import APIRouter, Depends, UploadFile, File
from app.auth import current_active_user
from app.models import User as UserModel
from app.schemas.ocr import OcrExtractResponse
@@ -11,7 +7,6 @@ from app.core.gemini import GeminiOCRService, gemini_initialization_error
from app.core.exceptions import (
    OCRServiceUnavailableError,
    OCRServiceConfigError,
-    OCRUnexpectedError,
    OCRQuotaExceededError,
    InvalidFileTypeError,
    FileTooLargeError,
@@ -37,26 +32,22 @@ async def ocr_extract_items(
    Accepts an image upload, sends it to Gemini Flash with a prompt
    to extract shopping list items, and returns the parsed items.
    """
-    # Check if Gemini client initialized correctly
    if gemini_initialization_error:
        logger.error("OCR endpoint called but Gemini client failed to initialize.")
        raise OCRServiceUnavailableError(gemini_initialization_error)

    logger.info(f"User {current_user.email} uploading image '{image_file.filename}' for OCR extraction.")

-    # --- File Validation ---
    if image_file.content_type not in settings.ALLOWED_IMAGE_TYPES:
        logger.warning(f"Invalid file type uploaded by {current_user.email}: {image_file.content_type}")
        raise InvalidFileTypeError()

-    # Simple size check
    contents = await image_file.read()
    if len(contents) > settings.MAX_FILE_SIZE_MB * 1024 * 1024:
        logger.warning(f"File too large uploaded by {current_user.email}: {len(contents)} bytes")
        raise FileTooLargeError()

    try:
-        # Use the ocr_service instance instead of the standalone function
        extracted_items = await ocr_service.extract_items(image_data=contents)

        logger.info(f"Successfully extracted {len(extracted_items)} items for user {current_user.email}.")
@@ -72,5 +63,4 @@ async def ocr_extract_items(
        raise OCRProcessingError(str(e))

    finally:
-        # Ensure file handle is closed
        await image_file.close()
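The endpoint's upload validation boils down to two guards, content type and byte size, checked before any Gemini call. A self-contained sketch of the same logic (the constants mirror the `settings` values shown in `config.py` below; the plain `ValueError`s stand in for the app's `InvalidFileTypeError`/`FileTooLargeError`):

```python
ALLOWED_IMAGE_TYPES = {"image/jpeg", "image/png", "image/webp"}
MAX_FILE_SIZE_MB = 10

def validate_upload(content_type: str, contents: bytes) -> None:
    """Reject uploads that are not supported image types or exceed the size cap."""
    if content_type not in ALLOWED_IMAGE_TYPES:
        raise ValueError(f"unsupported content type: {content_type}")
    if len(contents) > MAX_FILE_SIZE_MB * 1024 * 1024:
        raise ValueError(f"file too large: {len(contents)} bytes")

validate_upload("image/png", b"\x89PNG...")  # passes silently
```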
11
be/app/api/v1/endpoints/users.py
Normal file
@@ -0,0 +1,11 @@
from fastapi import APIRouter
from app.auth import fastapi_users
from app.schemas.user import UserPublic, UserUpdate

router = APIRouter()

router.include_router(
    fastapi_users.get_users_router(UserPublic, UserUpdate),
    prefix="",
    tags=["Users"],
)
@@ -21,11 +21,9 @@ from .database import get_session
from .models import User
from .config import settings

-# OAuth2 configuration
config = Config('.env')
oauth = OAuth(config)

-# Google OAuth2 setup
oauth.register(
    name='google',
    server_metadata_url='https://accounts.google.com/.well-known/openid-configuration',
@@ -35,7 +33,6 @@ oauth.register(
    }
)

-# Apple OAuth2 setup
oauth.register(
    name='apple',
    server_metadata_url='https://appleid.apple.com/.well-known/openid-configuration',
@@ -45,13 +42,11 @@ oauth.register(
    }
)

-# Custom Bearer Response with Refresh Token
class BearerResponseWithRefresh(BaseModel):
    access_token: str
    refresh_token: str
    token_type: str = "bearer"

-# Custom Bearer Transport that supports refresh tokens
class BearerTransportWithRefresh(BearerTransport):
    async def get_login_response(self, token: str, refresh_token: str = None) -> Response:
        if refresh_token:
@@ -61,14 +56,12 @@ class BearerTransportWithRefresh(BearerTransport):
                token_type="bearer"
            )
        else:
-            # Fallback to standard response if no refresh token
            bearer_response = {
                "access_token": token,
                "token_type": "bearer"
            }
        return JSONResponse(bearer_response.dict() if hasattr(bearer_response, 'dict') else bearer_response)

-# Custom Authentication Backend with Refresh Token Support
class AuthenticationBackendWithRefresh(AuthenticationBackend):
    def __init__(
        self,
@@ -83,7 +76,6 @@ class AuthenticationBackendWithRefresh(AuthenticationBackend):
        self.get_refresh_strategy = get_refresh_strategy

    async def login(self, strategy, user) -> Response:
-        # Generate both access and refresh tokens
        access_token = await strategy.write_token(user)
        refresh_strategy = self.get_refresh_strategy()
        refresh_token = await refresh_strategy.write_token(user)
@@ -118,23 +110,48 @@ class UserManager(IntegerIDMixin, BaseUserManager[User, int]):
    ):
        print(f"User {user.id} has logged in.")

+    async def delete(self, user: User, safe: bool = False, request: Optional[Request] = None):
+        """Soft-delete and anonymize the user instead of removing the DB row.
+
+        This mitigates catastrophic data-loss cascades that can occur when the
+        user row is physically deleted (see TODO issue #3). The record is kept
+        for referential integrity, while all personally identifiable
+        information (PII) is removed and the account is marked inactive.
+        """
+        # Lazily import to avoid circular deps and heavy imports at startup
+        from datetime import datetime, timezone
+
+        # Anonymise PII – keep a unique but meaningless email address
+        anonymised_suffix = f"deleted_{user.id}_{int(datetime.now(timezone.utc).timestamp())}"
+        user.email = f"user_{anonymised_suffix}@example.com"
+        user.name = None
+        user.hashed_password = ""
+        user.is_active = False
+        user.is_verified = False
+        user.deleted_at = datetime.now(timezone.utc)
+        user.is_deleted = True
+
+        # Persist the changes using the underlying user database adapter
+        await self.user_db.update(user)
+
+        # We purposefully *do not* commit a hard delete, so any FK references
+        # (expenses, lists, etc.) remain intact.
+        return None
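The soft-delete logic can be exercised in isolation. Here is a sketch of the same anonymization applied to a plain dataclass stand-in for the ORM user (the field names mirror the model above, but `UserRecord` and `anonymize` are hypothetical helpers, not part of the codebase):

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class UserRecord:
    id: int
    email: str
    name: Optional[str]
    hashed_password: str
    is_active: bool = True
    is_verified: bool = True
    is_deleted: bool = False
    deleted_at: Optional[datetime] = None

def anonymize(user: UserRecord, now: Optional[datetime] = None) -> UserRecord:
    """Scrub PII in place but keep the row, preserving FK references."""
    now = now or datetime.now(timezone.utc)
    suffix = f"deleted_{user.id}_{int(now.timestamp())}"
    user.email = f"user_{suffix}@example.com"  # unique but meaningless address
    user.name = None
    user.hashed_password = ""
    user.is_active = False
    user.is_verified = False
    user.deleted_at = now
    user.is_deleted = True
    return user
```

Passing a fixed `now` makes the generated email deterministic, which is handy in tests.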
||||
async def get_user_db(session: AsyncSession = Depends(get_session)):
|
||||
yield SQLAlchemyUserDatabase(session, User)
|
||||
|
||||
async def get_user_manager(user_db: SQLAlchemyUserDatabase = Depends(get_user_db)):
|
||||
yield UserManager(user_db)
|
||||
|
||||
# Updated transport with refresh token support
|
||||
bearer_transport = BearerTransportWithRefresh(tokenUrl="auth/jwt/login")
|
||||
bearer_transport = BearerTransportWithRefresh(tokenUrl="/api/v1/auth/jwt/login")
|
||||
|
||||
def get_jwt_strategy() -> JWTStrategy:
|
||||
return JWTStrategy(secret=settings.SECRET_KEY, lifetime_seconds=settings.ACCESS_TOKEN_EXPIRE_MINUTES * 60)
|
||||
|
||||
def get_refresh_jwt_strategy() -> JWTStrategy:
|
||||
# Refresh tokens last longer - 7 days
|
||||
return JWTStrategy(secret=settings.SECRET_KEY, lifetime_seconds=7 * 24 * 60 * 60)
|
||||
|
||||
# Updated auth backend with refresh token support
|
||||
auth_backend = AuthenticationBackendWithRefresh(
|
||||
name="jwt",
|
||||
transport=bearer_transport,
|
||||
be/app/config.py
@@ -26,18 +26,168 @@ class Settings(BaseSettings):
    MAX_FILE_SIZE_MB: int = 10  # Maximum allowed file size for OCR processing
    ALLOWED_IMAGE_TYPES: list[str] = ["image/jpeg", "image/png", "image/webp"]  # Supported image formats
    OCR_ITEM_EXTRACTION_PROMPT: str = """
Extract the shopping list items from this image.
List each distinct item on a new line.
Ignore prices, quantities, store names, discounts, taxes, totals, and other non-item text.
Focus only on the names of the products or items to be purchased.
Add 2 underscores before and after the item name, if it is struck through.
If the image does not appear to be a shopping list or receipt, state that clearly.
Example output for a grocery list:
Milk
Eggs
Bread
__Apples__
Organic Bananas
**ROLE & GOAL**

You are an expert AI assistant specializing in Optical Character Recognition (OCR) and structured data extraction. Your primary function is to act as a "Shopping List Digitizer."

Your goal is to meticulously analyze the provided image of a shopping list, which is likely handwritten, and convert it into a structured, machine-readable JSON format. You must be accurate, infer context where necessary, and handle the inherent ambiguities of handwriting and informal list-making.

**INPUT**

You will receive a single image (`[Image]`). This image contains a shopping list. It may be:
* Neatly written or very messy.
* On lined paper, a whiteboard, a napkin, or a dedicated notepad.
* Containing doodles, stains, or other visual noise.
* Using various formats (bullet points, numbered lists, columns, simple line breaks).
* Written in English or in German.

**CORE TASK: STEP-BY-STEP ANALYSIS**

Follow these steps precisely:

1. **Initial Image Analysis & OCR:**
    * Perform an advanced OCR scan on the entire image to transcribe all visible text.
    * Pay close attention to the spatial layout. Identify headings, columns, and line items. Note which text elements appear to be grouped together.

2. **Item Identification & Filtering:**
    * Differentiate between actual list items and non-item elements.
    * **INCLUDE:** Items intended for purchase.
    * **EXCLUDE:** List titles (e.g., "GROCERIES," "Target List"), dates, doodles, unrelated notes, or stray marks. Capture the list title separately if one exists.

3. **Detailed Extraction for Each Item:**
    For every single item you identify, extract the following attributes. If an attribute is not present, use `null`.

    * `item_name` (string): The primary name of the product.
        * **Standardize:** Normalize the name. (e.g., "B. Powder" -> "Baking Powder", "A. Juice" -> "Apple Juice").
        * **Contextual Guessing:** If a word is poorly written, use the context of a shopping list to make an educated guess. (e.g., "Ciffee" is almost certainly "Coffee").

    * `quantity` (number or string): The amount needed.
        * If a number is present (e.g., "**2** milks"), extract the number `2`.
        * If it's a word (e.g., "**a dozen** eggs"), extract the string `"a dozen"`.
        * If no quantity is specified (e.g., "Bread"), infer a default quantity of `1`.

    * `unit` (string): The unit of measurement or packaging.
        * Examples: "kg", "lbs", "liters", "gallons", "box", "can", "bag", "bunch".
        * Infer where possible (e.g., for "2 Milks," the unit could be inferred as "cartons" or "gallons" depending on regional context, but it's safer to leave it `null` if not explicitly stated).

    * `notes` (string): Any additional descriptive text.
        * Examples: "low-sodium," "organic," "brand name (Tide)," "for the cake," "get the ripe ones."

    * `category` (string): Infer a logical category for the item.
        * Use common grocery store categories: `Produce`, `Dairy & Eggs`, `Meat & Seafood`, `Pantry`, `Frozen`, `Bakery`, `Beverages`, `Household`, `Personal Care`.
        * If the list itself has category headings (e.g., a "DAIRY" section), use those first.

    * `original_text` (string): Provide the exact, unaltered text that your OCR transcribed for this entire line item. This is crucial for verification.

    * `is_crossed_out` (boolean): Set to `true` if the item is struck through, crossed out, or clearly marked as completed. Otherwise, set to `false`.

**HANDLING AMBIGUITIES AND EDGE CASES**

* **Illegible Text:** If a line or word is completely unreadable, set `item_name` to `"UNREADABLE"` and place the garbled OCR attempt in the `original_text` field.
* **Abbreviations:** Expand common shopping list abbreviations (e.g., "OJ" -> "Orange Juice", "TP" -> "Toilet Paper", "AVOs" -> "Avocados", "G. Beef" -> "Ground Beef").
* **Implicit Items:** If a line is vague like "Snacks for kids," list it as is. Do not invent specific items.
* **Multi-item Lines:** If a line contains multiple items (e.g., "Onions, Garlic, Ginger"), split them into separate item objects.

**OUTPUT FORMAT**

Your final output MUST be a single JSON object with the following structure. Do not include any explanatory text before or after the JSON block.

```json
{
  "list_title": "string or null",
  "items": [
    {
      "item_name": "string",
      "quantity": "number or string",
      "unit": "string or null",
      "category": "string",
      "notes": "string or null",
      "original_text": "string",
      "is_crossed_out": "boolean"
    }
  ],
  "summary": {
    "total_items": "integer",
    "unread_items": "integer",
    "crossed_out_items": "integer"
  }
}
```

**EXAMPLE WALKTHROUGH**

* **IF THE IMAGE SHOWS:** A crumpled sticky note with the title "Stuff for tonight" and the items:
    * `2x Chicken Breasts`
    * `~~Baguette~~` (this item is crossed out)
    * `Salad mix (bag)`
    * `Tomatos` (misspelled)
    * `Choc Ice Cream`

* **YOUR JSON OUTPUT SHOULD BE:**

```json
{
  "list_title": "Stuff for tonight",
  "items": [
    {
      "item_name": "Chicken Breasts",
      "quantity": 2,
      "unit": null,
      "category": "Meat & Seafood",
      "notes": null,
      "original_text": "2x Chicken Breasts",
      "is_crossed_out": false
    },
    {
      "item_name": "Baguette",
      "quantity": 1,
      "unit": null,
      "category": "Bakery",
      "notes": null,
      "original_text": "Baguette",
      "is_crossed_out": true
    },
    {
      "item_name": "Salad Mix",
      "quantity": 1,
      "unit": "bag",
      "category": "Produce",
      "notes": null,
      "original_text": "Salad mix (bag)",
      "is_crossed_out": false
    },
    {
      "item_name": "Tomatoes",
      "quantity": 1,
      "unit": null,
      "category": "Produce",
      "notes": null,
      "original_text": "Tomatos",
      "is_crossed_out": false
    },
    {
      "item_name": "Chocolate Ice Cream",
      "quantity": 1,
      "unit": null,
      "category": "Frozen",
      "notes": null,
      "original_text": "Choc Ice Cream",
      "is_crossed_out": false
    }
  ],
  "summary": {
    "total_items": 5,
    "unread_items": 0,
    "crossed_out_items": 1
  }
}
```

**FINAL INSTRUCTION**

If the image provided is not a shopping list or is completely blank/unintelligible, respond with a JSON object where the `items` array is empty and add a note in the `list_title` field, such as "Image does not appear to be a shopping list."

Now, analyze the provided image and generate the JSON output.
"""
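The JSON contract the prompt demands can be checked mechanically on the backend; a minimal validation sketch (a hypothetical helper, not part of the codebase) that verifies item shape and recomputes the summary counts:

```python
import json

# Keys every item object must carry, per the prompt's OUTPUT FORMAT section.
REQUIRED_ITEM_KEYS = {"item_name", "quantity", "unit", "category",
                      "notes", "original_text", "is_crossed_out"}

def check_ocr_payload(payload: dict) -> dict:
    """Validate a digitized-list payload and recompute its summary block."""
    items = payload.get("items", [])
    for item in items:
        missing = REQUIRED_ITEM_KEYS - item.keys()
        if missing:
            raise ValueError(f"item missing keys: {sorted(missing)}")
    return {
        "total_items": len(items),
        "unread_items": sum(1 for i in items if i["item_name"] == "UNREADABLE"),
        "crossed_out_items": sum(1 for i in items if i["is_crossed_out"]),
    }
```

Comparing the recomputed summary against the model-supplied one catches hallucinated counts before they reach the client.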
    # --- OCR Error Messages ---
    OCR_SERVICE_UNAVAILABLE: str = "OCR service is currently unavailable. Please try again later."
@@ -49,7 +199,7 @@ Organic Bananas
    OCR_PROCESSING_ERROR: str = "Error processing image: {detail}"

    # --- Gemini AI Settings ---
-   GEMINI_MODEL_NAME: str = "gemini-2.0-flash"  # The model to use for OCR
+   GEMINI_MODEL_NAME: str = "gemini-2.5-flash-preview-05-20"  # The model to use for OCR
    GEMINI_SAFETY_SETTINGS: dict = {
        "HARM_CATEGORY_HATE_SPEECH": "BLOCK_MEDIUM_AND_ABOVE",
        "HARM_CATEGORY_DANGEROUS_CONTENT": "BLOCK_MEDIUM_AND_ABOVE",
@@ -129,8 +279,10 @@ Organic Bananas
    APPLE_REDIRECT_URI: str = "https://mitlistbe.mohamad.dev/api/v1/auth/apple/callback"

    # Session Settings
-   SESSION_SECRET_KEY: str = "your-session-secret-key"  # Change this in production
-   ACCESS_TOKEN_EXPIRE_MINUTES: int = 480  # 8 hours instead of 30 minutes
+   # Session secret is required; fail fast if not provided via environment.
+   SESSION_SECRET_KEY: str | None = None  # Must be set via env in production; fallback generated in dev/test
+   # Shorter token lifetime to reduce risk if a token is leaked.
+   ACCESS_TOKEN_EXPIRE_MINUTES: int = 60

    # Redis Settings
    REDIS_URL: str = "redis://localhost:6379"
@@ -177,6 +329,18 @@ settings = Settings()
if settings.DATABASE_URL is None:
    raise ValueError("DATABASE_URL environment variable must be set.")

# Dynamically generate a session secret in non-production environments to
# maintain backwards-compatibility with local test setups while still failing
# hard in production if a proper secret is missing.
if not settings.SESSION_SECRET_KEY:
    if settings.is_production:
        raise ValueError("SESSION_SECRET_KEY environment variable must be set in production")
    else:
        import secrets as _secrets
        generated_secret = _secrets.token_urlsafe(32)
        object.__setattr__(settings, "SESSION_SECRET_KEY", generated_secret)
        logger.warning("SESSION_SECRET_KEY not provided; generated a temporary secret for development use.")

# Enforce secure secret key
if not settings.SECRET_KEY:
    raise ValueError("SECRET_KEY environment variable must be set. Generate a secure key using: openssl rand -hex 32")
@@ -187,9 +351,6 @@ if len(settings.SECRET_KEY) < 32:

# Production-specific validations
if settings.is_production:
-   if settings.SESSION_SECRET_KEY == "your-session-secret-key":
-       raise ValueError("SESSION_SECRET_KEY must be changed from default value in production")
-
    if not settings.SENTRY_DSN:
        logger.warning("SENTRY_DSN not set in production environment. Error tracking will be unavailable.")
@@ -1,28 +1,20 @@
from typing import Dict, Any
from app.config import settings

# API Version
API_VERSION = "v1"

# API Prefix
API_PREFIX = f"/api/{API_VERSION}"

# API Endpoints
class APIEndpoints:
    # Auth
    AUTH = {
        "LOGIN": "/auth/login",
        "SIGNUP": "/auth/signup",
        "REFRESH_TOKEN": "/auth/refresh-token",
    }

    # Users
    USERS = {
        "PROFILE": "/users/profile",
        "UPDATE_PROFILE": "/users/profile",
    }

    # Lists
    LISTS = {
        "BASE": "/lists",
        "BY_ID": "/lists/{id}",
@@ -30,7 +22,6 @@ class APIEndpoints:
        "ITEM": "/lists/{list_id}/items/{item_id}",
    }

    # Groups
    GROUPS = {
        "BASE": "/groups",
        "BY_ID": "/groups/{id}",
@@ -38,7 +29,6 @@ class APIEndpoints:
        "MEMBERS": "/groups/{group_id}/members",
    }

    # Invites
    INVITES = {
        "BASE": "/invites",
        "BY_ID": "/invites/{id}",
@@ -46,12 +36,10 @@ class APIEndpoints:
        "DECLINE": "/invites/{id}/decline",
    }

    # OCR
    OCR = {
        "PROCESS": "/ocr/process",
    }

    # Financials
    FINANCIALS = {
        "EXPENSES": "/financials/expenses",
        "EXPENSE": "/financials/expenses/{id}",
@@ -59,12 +47,10 @@ class APIEndpoints:
        "SETTLEMENT": "/financials/settlements/{id}",
    }

    # Health
    HEALTH = {
        "CHECK": "/health",
    }

# API Metadata
API_METADATA = {
    "title": settings.API_TITLE,
    "description": settings.API_DESCRIPTION,
@@ -74,7 +60,6 @@ API_METADATA = {
    "redoc_url": settings.API_REDOC_URL,
}

# API Tags
API_TAGS = [
    {"name": "Authentication", "description": "Authentication and authorization endpoints"},
    {"name": "Users", "description": "User management endpoints"},
@@ -86,7 +71,7 @@ API_TAGS = [
    {"name": "Health", "description": "Health check endpoints"},
]

# Helper function to get full API URL

def get_api_url(endpoint: str, **kwargs) -> str:
    """
    Get the full API URL for an endpoint.
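The `get_api_url` body is truncated in this diff; presumably it joins `API_PREFIX` with an endpoint template and fills its `{placeholders}`. A hypothetical sketch under that assumption (the `.format(**kwargs)` fill is not shown in the diff):

```python
API_PREFIX = "/api/v1"  # mirrors the API_PREFIX constant above

def get_api_url(endpoint: str, **kwargs) -> str:
    # Hypothetical body: substitute {placeholders} in the template, then prefix it.
    return f"{API_PREFIX}{endpoint.format(**kwargs)}"
```

This lets callers write `get_api_url(APIEndpoints.LISTS["BY_ID"], id=5)` instead of hand-building paths.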
be/app/core/cache.py (new file)
@@ -0,0 +1,78 @@
import json
import hashlib
from functools import wraps
from typing import Any, Callable, Optional
from app.core.redis import get_redis
import pickle

def generate_cache_key(func_name: str, args: tuple, kwargs: dict) -> str:
    """Generate a unique cache key based on function name and arguments."""
    # Create a string representation of args and kwargs
    key_data = {
        'function': func_name,
        'args': str(args),
        'kwargs': str(sorted(kwargs.items()))
    }
    key_string = json.dumps(key_data, sort_keys=True)
    # Use SHA256 hash for consistent, shorter keys
    return f"cache:{hashlib.sha256(key_string.encode()).hexdigest()}"
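The key scheme is deterministic, so repeated calls with the same arguments always map to the same Redis entry. A self-contained check of that property (the function body is copied from the snippet above):

```python
import hashlib
import json

def generate_cache_key(func_name: str, args: tuple, kwargs: dict) -> str:
    """Mirror of the cache-key scheme above: hash the function name plus stringified arguments."""
    key_data = {
        'function': func_name,
        'args': str(args),
        'kwargs': str(sorted(kwargs.items()))
    }
    key_string = json.dumps(key_data, sort_keys=True)
    return f"cache:{hashlib.sha256(key_string.encode()).hexdigest()}"
```

Note the SHA-256 digest keeps keys a fixed 64 hex characters regardless of argument size.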
def cache(expire_time: int = 3600, key_prefix: Optional[str] = None):
    """
    Decorator to cache function results in Redis.

    Args:
        expire_time: Expiration time in seconds (default: 1 hour)
        key_prefix: Optional prefix for cache keys
    """
    def decorator(func: Callable) -> Callable:
        @wraps(func)
        async def wrapper(*args, **kwargs) -> Any:
            redis_client = await get_redis()

            # Generate cache key
            cache_key = generate_cache_key(func.__name__, args, kwargs)
            if key_prefix:
                cache_key = f"{key_prefix}:{cache_key}"

            try:
                # Try to get from cache
                cached_result = await redis_client.get(cache_key)
                if cached_result:
                    # Deserialize and return cached result
                    return pickle.loads(cached_result)

                # Cache miss - execute function
                result = await func(*args, **kwargs)

                # Store result in cache
                serialized_result = pickle.dumps(result)
                await redis_client.setex(cache_key, expire_time, serialized_result)

                return result

            except Exception as e:
                # If caching fails, still execute the function
                print(f"Cache error: {e}")
                return await func(*args, **kwargs)

        return wrapper
    return decorator

async def invalidate_cache_pattern(pattern: str):
    """Invalidate all cache keys matching a pattern."""
    redis_client = await get_redis()
    try:
        keys = await redis_client.keys(pattern)
        if keys:
            await redis_client.delete(*keys)
    except Exception as e:
        print(f"Cache invalidation error: {e}")

async def clear_all_cache():
    """Clear all cache entries."""
    redis_client = await get_redis()
    try:
        await redis_client.flushdb()
    except Exception as e:
        print(f"Cache clear error: {e}")
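The same decorator pattern works against any async key-value store; a runnable in-memory sketch (a plain dict standing in for Redis) demonstrating the cache-hit path without a running Redis server:

```python
import asyncio
import pickle
from functools import wraps

_store: dict = {}            # stand-in for Redis
calls = {"count": 0}         # counts real (uncached) executions

def cache_in_memory(func):
    """Cache an async function's pickled result under its name + arguments."""
    @wraps(func)
    async def wrapper(*args, **kwargs):
        key = (func.__name__, args, tuple(sorted(kwargs.items())))
        if key in _store:
            return pickle.loads(_store[key])  # cache hit
        result = await func(*args, **kwargs)  # cache miss
        _store[key] = pickle.dumps(result)
        return result
    return wrapper

@cache_in_memory
async def expensive_lookup(x: int) -> int:
    calls["count"] += 1
    return x * x
```

Swapping `_store` for `redis_client.get`/`setex` recovers the decorator above; pickling imposes the same caveat in both variants: only cache trusted, picklable values.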
@@ -48,7 +48,6 @@ def calculate_next_due_date(
    today = date.today()
    reference_future_date = max(today, base_date)

    # This loop ensures the next_due date is always in the future relative to the reference_future_date.
    while next_due <= reference_future_date:
        current_base_for_recalc = next_due

@@ -70,9 +69,7 @@ def calculate_next_due_date(
        else:  # Should not be reached
            break

        # Safety break: if date hasn't changed, interval is zero or logic error.
        if next_due == current_base_for_recalc:
            # Log error ideally, then advance by one day to prevent infinite loop.
            next_due += timedelta(days=1)
            break
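The advance-until-future loop and its safety break can be exercised standalone; a sketch for a simple fixed-interval recurrence (the real `calculate_next_due_date` supports more frequencies than this assumed day-based one):

```python
from datetime import date, timedelta

def next_due_after(base_date: date, reference: date, interval_days: int) -> date:
    """Advance base_date by interval_days until it is strictly after reference.

    The safety break guards against a zero or negative interval, which would
    otherwise loop forever: advance one day past the stuck value and stop.
    """
    next_due = base_date
    while next_due <= reference:
        previous = next_due
        next_due += timedelta(days=interval_days)
        if next_due <= previous:  # interval did not move the date forward
            next_due = previous + timedelta(days=1)
            break
    return next_due
```

The guard mirrors the diff's intent: a misconfigured recurrence degrades to "tomorrow" instead of hanging the request.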
be/app/core/error_handlers.py (new file)
@@ -0,0 +1,17 @@
from fastapi import Request, HTTPException, status
from fastapi.responses import JSONResponse
from sqlalchemy.exc import SQLAlchemyError
import logging

logger = logging.getLogger(__name__)

GENERIC_DB_ERROR = "Database error, please try again."
GENERIC_SERVER_ERROR = "Internal server error. Please contact support if the problem persists."

async def sqlalchemy_exception_handler(request: Request, exc: SQLAlchemyError):
    logger.error("SQLAlchemyError", exc_info=exc)
    return JSONResponse(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, content={"detail": GENERIC_DB_ERROR})

async def generic_exception_handler(request: Request, exc: Exception):
    logger.error("Unhandled exception", exc_info=exc)
    return JSONResponse(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, content={"detail": GENERIC_SERVER_ERROR})
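Handlers like these are resolved by exception type; a framework-free sketch of that dispatch, walking the exception's MRO so the most specific registered handler wins (the same idea the framework applies when both a `SQLAlchemyError` and a catch-all `Exception` handler are registered):

```python
handlers = {}

def add_exception_handler(exc_type, handler):
    """Register a handler for an exception class, mirroring app.add_exception_handler."""
    handlers[exc_type] = handler

def dispatch(exc: Exception) -> str:
    """Find the handler for the most specific matching class; re-raise if none."""
    for cls in type(exc).__mro__:
        if cls in handlers:
            return handlers[cls](exc)
    raise exc

class DatabaseError(Exception):
    pass

add_exception_handler(DatabaseError, lambda e: "Database error, please try again.")
add_exception_handler(Exception, lambda e: "Internal server error.")
```

A `DatabaseError` hits the specific handler; anything else falls through to the generic one, so clients never see raw tracebacks.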
@@ -258,14 +258,30 @@ class InviteOperationError(HTTPException):

class SettlementOperationError(HTTPException):
    """Raised when a settlement operation fails."""
-   def __init__(self, detail: str):
+   def __init__(self, detail: str = "An error occurred during a settlement operation."):
        super().__init__(
            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
            detail=detail
        )

class FinancialConflictError(HTTPException):
    """Raised when a financial operation conflicts with business logic."""
    def __init__(self, detail: str):
        super().__init__(
            status_code=status.HTTP_409_CONFLICT,
            detail=detail
        )

class OverpaymentError(HTTPException):
    """Raised when a settlement activity would cause overpayment."""
    def __init__(self, detail: str):
        super().__init__(
            status_code=status.HTTP_400_BAD_REQUEST,
            detail=detail
        )

class ConflictError(HTTPException):
-   """Raised when an optimistic lock version conflict occurs."""
+   """Raised when a conflict occurs."""
    def __init__(self, detail: str):
        super().__init__(
            status_code=status.HTTP_409_CONFLICT,
@@ -332,9 +348,21 @@ class UserOperationError(HTTPException):
            detail=detail
        )

class ChoreOperationError(HTTPException):
    """Raised when a chore-related operation fails."""
    def __init__(self, detail: str):
        super().__init__(
            status_code=status.HTTP_400_BAD_REQUEST,
            detail=detail
        )

class ChoreNotFoundError(HTTPException):
-   """Raised when a chore is not found."""
-   def __init__(self, chore_id: int, group_id: Optional[int] = None, detail: Optional[str] = None):
+   """Raised when a chore or assignment is not found."""
+   def __init__(self, chore_id: int = None, assignment_id: int = None, group_id: Optional[int] = None, detail: Optional[str] = None):
        self.chore_id = chore_id
        self.assignment_id = assignment_id
        self.group_id = group_id

        if detail:
            error_detail = detail
        elif group_id is not None:
@@ -354,4 +382,3 @@ class PermissionDeniedError(HTTPException):
            detail=detail
        )

-# Financials & Cost Splitting specific errors
@@ -1,8 +1,6 @@
# app/core/gemini.py
import logging
from typing import List
import google.generativeai as genai
from google.generativeai.types import HarmCategory, HarmBlockThreshold  # For safety settings
from google.api_core import exceptions as google_exceptions
from app.config import settings
from app.core.exceptions import (
@@ -15,15 +13,12 @@ from app.core.exceptions import (

logger = logging.getLogger(__name__)

# --- Global variable to hold the initialized model client ---
gemini_flash_client = None
-gemini_initialization_error = None  # Store potential init error
+gemini_initialization_error = None

# --- Configure and Initialize ---
try:
    if settings.GEMINI_API_KEY:
        genai.configure(api_key=settings.GEMINI_API_KEY)
        # Initialize the specific model we want to use
        gemini_flash_client = genai.GenerativeModel(
            model_name=settings.GEMINI_MODEL_NAME,
            generation_config=genai.types.GenerationConfig(
@@ -32,18 +27,15 @@ try:
        )
        logger.info(f"Gemini AI client initialized successfully for model '{settings.GEMINI_MODEL_NAME}'.")
    else:
        # Store error if API key is missing
        gemini_initialization_error = "GEMINI_API_KEY not configured. Gemini client not initialized."
        logger.error(gemini_initialization_error)

except Exception as e:
    # Catch any other unexpected errors during initialization
    gemini_initialization_error = f"Failed to initialize Gemini AI client: {e}"
-   logger.exception(gemini_initialization_error)  # Log full traceback
-   gemini_flash_client = None  # Ensure client is None on error
+   logger.exception(gemini_initialization_error)
+   gemini_flash_client = None

# --- Function to get the client (optional, allows checking error) ---
def get_gemini_client():
    """
    Returns the initialized Gemini client instance.
@@ -52,23 +44,172 @@ def get_gemini_client():
    if gemini_initialization_error:
        raise OCRServiceConfigError()
    if gemini_flash_client is None:
        # This case should ideally be covered by the check above, but as a safeguard:
        raise OCRServiceConfigError()
    return gemini_flash_client
# Define the prompt as a constant
OCR_ITEM_EXTRACTION_PROMPT = """
Extract the shopping list items from this image.
List each distinct item on a new line.
Ignore prices, quantities, store names, discounts, taxes, totals, and other non-item text.
Focus only on the names of the products or items to be purchased.
If the image does not appear to be a shopping list or receipt, state that clearly.
Example output for a grocery list:
Milk
Eggs
Bread
Apples
Organic Bananas
**ROLE & GOAL**

You are an expert AI assistant specializing in Optical Character Recognition (OCR) and structured data extraction. Your primary function is to act as a "Shopping List Digitizer."

Your goal is to meticulously analyze the provided image of a shopping list, which is likely handwritten, and convert it into a structured, machine-readable JSON format. You must be accurate, infer context where necessary, and handle the inherent ambiguities of handwriting and informal list-making.

**INPUT**

You will receive a single image (`[Image]`). This image contains a shopping list. It may be:
* Neatly written or very messy.
* On lined paper, a whiteboard, a napkin, or a dedicated notepad.
* Containing doodles, stains, or other visual noise.
* Using various formats (bullet points, numbered lists, columns, simple line breaks).
* Written in English or in German.

**CORE TASK: STEP-BY-STEP ANALYSIS**

Follow these steps precisely:

1. **Initial Image Analysis & OCR:**
    * Perform an advanced OCR scan on the entire image to transcribe all visible text.
    * Pay close attention to the spatial layout. Identify headings, columns, and line items. Note which text elements appear to be grouped together.

2. **Item Identification & Filtering:**
    * Differentiate between actual list items and non-item elements.
    * **INCLUDE:** Items intended for purchase.
    * **EXCLUDE:** List titles (e.g., "GROCERIES," "Target List"), dates, doodles, unrelated notes, or stray marks. Capture the list title separately if one exists.

3. **Detailed Extraction for Each Item:**
    For every single item you identify, extract the following attributes. If an attribute is not present, use `null`.

    * `item_name` (string): The primary name of the product.
        * **Standardize:** Normalize the name. (e.g., "B. Powder" -> "Baking Powder", "A. Juice" -> "Apple Juice").
        * **Contextual Guessing:** If a word is poorly written, use the context of a shopping list to make an educated guess. (e.g., "Ciffee" is almost certainly "Coffee").

    * `quantity` (number or string): The amount needed.
        * If a number is present (e.g., "**2** milks"), extract the number `2`.
        * If it's a word (e.g., "**a dozen** eggs"), extract the string `"a dozen"`.
        * If no quantity is specified (e.g., "Bread"), infer a default quantity of `1`.

    * `unit` (string): The unit of measurement or packaging.
        * Examples: "kg", "lbs", "liters", "gallons", "box", "can", "bag", "bunch".
        * Infer where possible (e.g., for "2 Milks," the unit could be inferred as "cartons" or "gallons" depending on regional context, but it's safer to leave it `null` if not explicitly stated).

    * `notes` (string): Any additional descriptive text.
        * Examples: "low-sodium," "organic," "brand name (Tide)," "for the cake," "get the ripe ones."

    * `category` (string): Infer a logical category for the item.
        * Use common grocery store categories: `Produce`, `Dairy & Eggs`, `Meat & Seafood`, `Pantry`, `Frozen`, `Bakery`, `Beverages`, `Household`, `Personal Care`.
        * If the list itself has category headings (e.g., a "DAIRY" section), use those first.

    * `original_text` (string): Provide the exact, unaltered text that your OCR transcribed for this entire line item. This is crucial for verification.

    * `is_crossed_out` (boolean): Set to `true` if the item is struck through, crossed out, or clearly marked as completed. Otherwise, set to `false`.

**HANDLING AMBIGUITIES AND EDGE CASES**

* **Illegible Text:** If a line or word is completely unreadable, set `item_name` to `"UNREADABLE"` and place the garbled OCR attempt in the `original_text` field.
* **Abbreviations:** Expand common shopping list abbreviations (e.g., "OJ" -> "Orange Juice", "TP" -> "Toilet Paper", "AVOs" -> "Avocados", "G. Beef" -> "Ground Beef").
* **Implicit Items:** If a line is vague like "Snacks for kids," list it as is. Do not invent specific items.
* **Multi-item Lines:** If a line contains multiple items (e.g., "Onions, Garlic, Ginger"), split them into separate item objects.

**OUTPUT FORMAT**

Your final output MUST be a single JSON object with the following structure. Do not include any explanatory text before or after the JSON block.

```json
{
  "list_title": "string or null",
  "items": [
    {
      "item_name": "string",
      "quantity": "number or string",
      "unit": "string or null",
      "category": "string",
      "notes": "string or null",
      "original_text": "string",
      "is_crossed_out": "boolean"
    }
  ],
  "summary": {
    "total_items": "integer",
    "unread_items": "integer",
    "crossed_out_items": "integer"
  }
}
```

**EXAMPLE WALKTHROUGH**

* **IF THE IMAGE SHOWS:** A crumpled sticky note with the title "Stuff for tonight" and the items:
    * `2x Chicken Breasts`
    * `~~Baguette~~` (this item is crossed out)
    * `Salad mix (bag)`
    * `Tomatos` (misspelled)
    * `Choc Ice Cream`

* **YOUR JSON OUTPUT SHOULD BE:**

```json
{
  "list_title": "Stuff for tonight",
  "items": [
    {
      "item_name": "Chicken Breasts",
      "quantity": 2,
      "unit": null,
      "category": "Meat & Seafood",
      "notes": null,
      "original_text": "2x Chicken Breasts",
      "is_crossed_out": false
    },
    {
      "item_name": "Baguette",
      "quantity": 1,
      "unit": null,
      "category": "Bakery",
      "notes": null,
      "original_text": "Baguette",
      "is_crossed_out": true
    },
    {
      "item_name": "Salad Mix",
      "quantity": 1,
      "unit": "bag",
      "category": "Produce",
      "notes": null,
      "original_text": "Salad mix (bag)",
      "is_crossed_out": false
    },
    {
      "item_name": "Tomatoes",
      "quantity": 1,
      "unit": null,
      "category": "Produce",
      "notes": null,
      "original_text": "Tomatos",
      "is_crossed_out": false
    },
    {
      "item_name": "Chocolate Ice Cream",
      "quantity": 1,
      "unit": null,
      "category": "Frozen",
      "notes": null,
      "original_text": "Choc Ice Cream",
      "is_crossed_out": false
    }
  ],
  "summary": {
    "total_items": 5,
    "unread_items": 0,
    "crossed_out_items": 1
  }
}
```

**FINAL INSTRUCTION**

If the image provided is not a shopping list or is completely blank/unintelligible, respond with a JSON object where the `items` array is empty and add a note in the `list_title` field, such as "Image does not appear to be a shopping list."

Now, analyze the provided image and generate the JSON output.
"""
async def extract_items_from_image_gemini(image_bytes: bytes, mime_type: str = "image/jpeg") -> List[str]:
@@ -92,29 +233,22 @@ async def extract_items_from_image_gemini(image_bytes: bytes, mime_type: str = "
    try:
        client = get_gemini_client()  # Raises OCRServiceConfigError if not initialized

        # Prepare image part for multimodal input
        image_part = {
            "mime_type": mime_type,
            "data": image_bytes
        }

        # Prepare the full prompt content
        prompt_parts = [
-           settings.OCR_ITEM_EXTRACTION_PROMPT,  # Text prompt first
-           image_part  # Then the image
+           settings.OCR_ITEM_EXTRACTION_PROMPT,
+           image_part
        ]

        logger.info("Sending image to Gemini for item extraction...")

        # Make the API call
        # Use generate_content_async for async FastAPI
        response = await client.generate_content_async(prompt_parts)

        # --- Process the response ---
        # Check for safety blocks or lack of content
        if not response.candidates or not response.candidates[0].content.parts:
            logger.warning("Gemini response blocked or empty.", extra={"response": response})
            # Check finish_reason if available
            finish_reason = response.candidates[0].finish_reason if response.candidates else 'UNKNOWN'
            safety_ratings = response.candidates[0].safety_ratings if response.candidates else 'N/A'
            if finish_reason == 'SAFETY':
@@ -122,18 +256,13 @@ async def extract_items_from_image_gemini(image_bytes: bytes, mime_type: str = "
            else:
                raise OCRUnexpectedError()

-       # Extract text - assumes the first part of the first candidate is the text response
-       raw_text = response.text  # response.text is a shortcut for response.candidates[0].content.parts[0].text
+       raw_text = response.text
        logger.info("Received raw text from Gemini.")
        # logger.debug(f"Gemini Raw Text:\n{raw_text}")  # Optional: Log full response text

        # Parse the text response
        items = []
-       for line in raw_text.splitlines():  # Split by newline
-           cleaned_line = line.strip()  # Remove leading/trailing whitespace
-           # Basic filtering: ignore empty lines and potential non-item lines
-           if cleaned_line and len(cleaned_line) > 1:  # Ignore very short lines too?
-               # Add more sophisticated filtering if needed (e.g., regex, keyword check)
+       for line in raw_text.splitlines():
+           cleaned_line = line.strip()
+           if cleaned_line and len(cleaned_line) > 1:
                items.append(cleaned_line)

        logger.info(f"Extracted {len(items)} potential items.")
@ -145,12 +274,9 @@ async def extract_items_from_image_gemini(image_bytes: bytes, mime_type: str = "
|
||||
raise OCRQuotaExceededError()
|
||||
raise OCRServiceUnavailableError()
|
||||
except (OCRServiceConfigError, OCRQuotaExceededError, OCRServiceUnavailableError, OCRProcessingError, OCRUnexpectedError):
|
||||
# Re-raise specific OCR exceptions
|
||||
raise
|
||||
except Exception as e:
|
||||
# Catch other unexpected errors during generation or processing
|
||||
logger.error(f"Unexpected error during Gemini item extraction: {e}", exc_info=True)
|
||||
# Wrap in a custom exception
|
||||
raise OCRUnexpectedError()
|
||||
|
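The retained parsing loop is plain string handling, so it can be exercised on its own; a sketch with hypothetical sample text (not real Gemini output):

```python
def parse_items(raw_text: str) -> list[str]:
    """Mirror of the splitlines/strip/length filter kept in the handler above."""
    items = []
    for line in raw_text.splitlines():
        cleaned = line.strip()
        # Skip blank lines and one-character noise, as the handler does.
        if cleaned and len(cleaned) > 1:
            items.append(cleaned)
    return items

sample = "Milk\n  Eggs  \n\nx\nBread"
print(parse_items(sample))  # ['Milk', 'Eggs', 'Bread']
```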
 class GeminiOCRService:
@@ -186,27 +312,22 @@ class GeminiOCRService:
             OCRUnexpectedError: For any other unexpected errors.
         """
         try:
             # Create image part
             image_parts = [{"mime_type": mime_type, "data": image_data}]

             # Generate content
             response = await self.model.generate_content_async(
                 contents=[settings.OCR_ITEM_EXTRACTION_PROMPT, *image_parts]
             )

             # Process response
             if not response.text:
                 logger.warning("Gemini response is empty")
                 raise OCRUnexpectedError()

             # Check for safety blocks
             if hasattr(response, 'candidates') and response.candidates and hasattr(response.candidates[0], 'finish_reason'):
                 finish_reason = response.candidates[0].finish_reason
                 if finish_reason == 'SAFETY':
                     safety_ratings = response.candidates[0].safety_ratings if hasattr(response.candidates[0], 'safety_ratings') else 'N/A'
                     raise OCRProcessingError(f"Gemini response blocked due to safety settings. Ratings: {safety_ratings}")

             # Split response into lines and clean up
             items = []
             for line in response.text.splitlines():
                 cleaned_line = line.strip()
@@ -222,7 +343,6 @@ class GeminiOCRService:
             raise OCRQuotaExceededError()
         raise OCRServiceUnavailableError()
         except (OCRServiceConfigError, OCRQuotaExceededError, OCRServiceUnavailableError, OCRProcessingError, OCRUnexpectedError):
-            # Re-raise specific OCR exceptions
             raise
         except Exception as e:
             logger.error(f"Unexpected error during Gemini item extraction: {e}", exc_info=True)

be/app/core/logging_utils.py (new file, 30 lines)
@@ -0,0 +1,30 @@
+import logging
+import re
+
+EMAIL_RE = re.compile(r"[\w\.-]+@[\w\.-]+", re.IGNORECASE)
+
+class PiiRedactionFilter(logging.Filter):
+    """Filter that redacts email addresses and long numeric IDs from log records."""
+
+    def filter(self, record: logging.LogRecord) -> bool:
+        if isinstance(record.msg, dict):
+            # For structured logs we mutate in-place.
+            record.msg = self._redact_dict(record.msg)
+        elif isinstance(record.msg, str):
+            record.msg = self._redact_text(record.msg)
+        return True  # Always log, but redacted
+
+    def _redact_text(self, text: str) -> str:
+        text = EMAIL_RE.sub("<redacted-email>", text)
+        # Redact numeric IDs longer than 6 digits
+        text = re.sub(r"(?<!\d)(\d{7,})(?!\d)", "<id>", text)
+        return text
+
+    def _redact_dict(self, data):
+        redacted = {}
+        for k, v in data.items():
+            if isinstance(v, str):
+                redacted[k] = self._redact_text(v)
+            else:
+                redacted[k] = v
+        return redacted
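The filter is attached with the standard `logger.addFilter(PiiRedactionFilter())`. The two redaction rules themselves can be checked in isolation; a standalone sketch of the same text path:

```python
import re

EMAIL_RE = re.compile(r"[\w\.-]+@[\w\.-]+", re.IGNORECASE)

def redact(text: str) -> str:
    """Same two rules as PiiRedactionFilter._redact_text."""
    text = EMAIL_RE.sub("<redacted-email>", text)
    # Only numeric runs of 7+ digits are treated as IDs.
    return re.sub(r"(?<!\d)(\d{7,})(?!\d)", "<id>", text)

assert redact("user alice@example.com, id 12345678") == "user <redacted-email>, id <id>"
# Six digits or fewer are left alone.
assert redact("order 123456") == "order 123456"
```

Note that `EMAIL_RE` is greedy on trailing dots and hyphens, so `"a@b.co."` redacts the final dot too; that is a known trade-off of simple email regexes.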
be/app/core/middleware.py (new file, 43 lines)
@@ -0,0 +1,43 @@
+from starlette.middleware.base import BaseHTTPMiddleware
+from starlette.requests import Request
+from starlette.responses import Response
+import time
+import logging
+import uuid
+
+logger = logging.getLogger("structured")
+
+class RequestContextMiddleware(BaseHTTPMiddleware):
+    """Adds a unique request ID and logs request / response details."""
+
+    async def dispatch(self, request: Request, call_next):
+        request_id = str(uuid.uuid4())
+        start_time = time.time()
+        # Attach id to request state for downstream handlers
+        request.state.request_id = request_id
+
+        logger.info(
+            {
+                "event": "request_start",
+                "request_id": request_id,
+                "method": request.method,
+                "path": request.url.path,
+                "client": request.client.host if request.client else None,
+            }
+        )
+
+        response: Response = await call_next(request)
+
+        process_time = (time.time() - start_time) * 1000
+        logger.info(
+            {
+                "event": "request_end",
+                "request_id": request_id,
+                "status_code": response.status_code,
+                "duration_ms": round(process_time, 2),
+            }
+        )
+
+        # Propagate request id header for tracing
+        response.headers["X-Request-ID"] = request_id
+        return response
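An alternative to stashing the ID on `request.state` is a `contextvars.ContextVar`, which is also visible to log calls deep in the stack without threading the request through. A minimal sketch of that pattern (the function names are illustrative, not part of this PR):

```python
import uuid
from contextvars import ContextVar

request_id_var: ContextVar[str] = ContextVar("request_id", default="-")

def nested_log_line() -> str:
    # Any code running in the same task can read the current request's ID.
    return f"[{request_id_var.get()}] request_end"

def handle_request() -> str:
    """Simulates the middleware: set an ID, run nested code, then reset."""
    token = request_id_var.set(str(uuid.uuid4()))
    try:
        return nested_log_line()
    finally:
        request_id_var.reset(token)

line = handle_request()
assert line.endswith("request_end")
assert request_id_var.get() == "-"  # restored after the "request"
```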
be/app/core/rate_limiter.py (new file, 39 lines)
@@ -0,0 +1,39 @@
+import time, logging, asyncio
+from starlette.middleware.base import BaseHTTPMiddleware
+from starlette.requests import Request
+from starlette.responses import Response, JSONResponse
+from app.core.redis import redis_pool
+
+logger = logging.getLogger(__name__)
+
+RATE_LIMIT_PATHS = {
+    "/api/v1/auth/jwt/login": (5, 60),  # 5 requests per 60 seconds per IP
+}
+
+class RateLimitMiddleware(BaseHTTPMiddleware):
+    async def dispatch(self, request: Request, call_next):
+        path = request.url.path
+        limit_cfg = RATE_LIMIT_PATHS.get(path)
+        if not limit_cfg:
+            return await call_next(request)
+
+        max_requests, window = limit_cfg
+        client_ip = request.client.host if request.client else "unknown"
+        key = f"rate:{path}:{client_ip}"
+        try:
+            current = await redis_pool.get(key)
+            current_int = int(current) if current else 0
+            if current_int >= max_requests:
+                logger.warning(f"Rate limit exceeded for {client_ip} on {path}")
+                return JSONResponse(status_code=429, content={"detail": "Too Many Requests"})
+            # increment
+            pipe = redis_pool.pipeline()
+            pipe.incr(key, 1)
+            pipe.expire(key, window)
+            await pipe.execute()
+        except Exception as e:
+            logger.error(f"Rate limiting error: {e}")
+            # Fail-open if redis unavailable
+            pass
+
+        return await call_next(request)
be/app/core/redis.py (new file, 7 lines)
@@ -0,0 +1,7 @@
+import redis.asyncio as redis
+from app.config import settings
+
+redis_pool = redis.from_url(settings.REDIS_URL, encoding="utf-8", decode_responses=True)
+
+async def get_redis():
+    return redis_pool
@@ -1,8 +1,7 @@
 from apscheduler.schedulers.asyncio import AsyncIOScheduler
-from apscheduler.jobstores.sqlalchemy import SQLAlchemyJobStore
+from apscheduler.jobstores.memory import MemoryJobStore
 from apscheduler.executors.pool import ThreadPoolExecutor
 from apscheduler.triggers.cron import CronTrigger
-from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession
 from app.config import settings
 from app.jobs.recurring_expenses import generate_recurring_expenses
 from app.db.session import async_session
@@ -10,17 +9,13 @@ import logging

 logger = logging.getLogger(__name__)

-# Convert async database URL to sync URL for APScheduler
-# Replace postgresql+asyncpg:// with postgresql://
-sync_db_url = settings.DATABASE_URL.replace('postgresql+asyncpg://', 'postgresql://')
-
 # Configure the scheduler
 jobstores = {
-    'default': SQLAlchemyJobStore(url=sync_db_url)
+    'default': MemoryJobStore()
 }

+# Run scheduled jobs on a separate small thread pool to keep event loop free
 executors = {
-    'default': ThreadPoolExecutor(20)
+    'default': ThreadPoolExecutor(5)
 }

 job_defaults = {
@@ -36,7 +31,10 @@ scheduler = AsyncIOScheduler(
 )

 async def run_recurring_expenses_job():
-    """Wrapper function to run the recurring expenses job with a database session."""
+    """Wrapper function to run the recurring expenses job with a database session.
+
+    This function is used to generate recurring expenses for the user.
+    """
     try:
         async with async_session() as session:
             await generate_recurring_expenses(session)
@@ -47,7 +45,6 @@ async def run_recurring_expenses_job():
 def init_scheduler():
     """Initialize and start the scheduler."""
     try:
-        # Add the recurring expenses job
         scheduler.add_job(
             run_recurring_expenses_job,
             trigger=CronTrigger(hour=0, minute=0),  # Run at midnight UTC
@@ -56,7 +53,6 @@ def init_scheduler():
             replace_existing=True
         )

-        # Start the scheduler
         scheduler.start()
         logger.info("Scheduler started successfully")
     except Exception as e:

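`CronTrigger(hour=0, minute=0)` fires at the next midnight in the scheduler's timezone. The equivalent next-run computation in plain `datetime` (illustrative only; APScheduler handles this internally, including DST edge cases this sketch ignores):

```python
from datetime import datetime, timedelta, timezone

def next_midnight_utc(now: datetime) -> datetime:
    """Next fire time for a cron schedule of hour=0, minute=0 (UTC)."""
    candidate = now.replace(hour=0, minute=0, second=0, microsecond=0)
    if candidate <= now:
        candidate += timedelta(days=1)  # today's midnight already passed
    return candidate

now = datetime(2024, 5, 1, 13, 30, tzinfo=timezone.utc)
assert next_midnight_utc(now) == datetime(2024, 5, 2, 0, 0, tzinfo=timezone.utc)
```

Worth noting on the jobstore change itself: with `MemoryJobStore`, scheduled jobs no longer survive a process restart; since the one job here is re-registered in `init_scheduler()` with `replace_existing=True` on every startup, persistence buys nothing, which is presumably why the SQLAlchemy store (and its sync-URL workaround) was dropped.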
@@ -1,20 +1,8 @@
 # app/core/security.py
-from datetime import datetime, timedelta, timezone
-from typing import Any, Union, Optional
-
-from jose import JWTError, jwt
+from datetime import datetime, timedelta
+from jose import jwt
+from typing import Optional
 from passlib.context import CryptContext

 from app.config import settings # Import settings from config

 # --- Password Hashing ---
 # These functions are used for password hashing and verification
 # They complement FastAPI-Users but provide direct access to the underlying password functionality
 # when needed outside of the FastAPI-Users authentication flow.

 # Configure passlib context
 # Using bcrypt as the default hashing scheme
 # 'deprecated="auto"' will automatically upgrade hashes if needed on verification
 pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")

 def verify_password(plain_password: str, hashed_password: str) -> bool:
@@ -33,7 +21,6 @@ def verify_password(plain_password: str, hashed_password: str) -> bool:
     try:
         return pwd_context.verify(plain_password, hashed_password)
     except Exception:
-        # Handle potential errors during verification (e.g., invalid hash format)
         return False

 def hash_password(password: str) -> str:
@@ -50,24 +37,38 @@ def hash_password(password: str) -> str:
     """
     return pwd_context.hash(password)

-# --- JSON Web Tokens (JWT) ---
-# FastAPI-Users now handles all JWT token creation and validation.
-# The code below is commented out because FastAPI-Users provides these features.
-# It's kept for reference in case a custom implementation is needed later.
-
-# Example of a potential future implementation:
-# def get_subject_from_token(token: str) -> Optional[str]:
-#     """
-#     Extract the subject (user ID) from a JWT token.
-#     This would be used if we need to validate tokens outside of FastAPI-Users flow.
-#     For now, use fastapi_users.current_user dependency instead.
-#     """
-#     # This would need to use FastAPI-Users' token verification if ever implemented
-#     # For example, by decoding the token using the strategy from the auth backend
-#     try:
-#         payload = jwt.decode(token, settings.SECRET_KEY, algorithms=[settings.ALGORITHM])
-#         return payload.get("sub")
-#     except JWTError:
-#         return None
-#     return None
+# Alias for compatibility with guest.py
+def get_password_hash(password: str) -> str:
+    """
+    Alias for hash_password function for backward compatibility.
+
+    Args:
+        password: The plain text password to hash.
+
+    Returns:
+        The resulting hash string.
+    """
+    return hash_password(password)
+
+def create_access_token(data: dict, expires_delta: Optional[timedelta] = None) -> str:
+    """
+    Create a JWT access token.
+
+    Args:
+        data: The data to encode in the token (typically {"sub": email}).
+        expires_delta: Optional custom expiration time.
+
+    Returns:
+        The encoded JWT token.
+    """
+    from app.config import settings
+
+    to_encode = data.copy()
+    if expires_delta:
+        expire = datetime.utcnow() + expires_delta
+    else:
+        expire = datetime.utcnow() + timedelta(minutes=settings.ACCESS_TOKEN_EXPIRE_MINUTES)
+
+    to_encode.update({"exp": expire})
+    encoded_jwt = jwt.encode(to_encode, settings.SECRET_KEY, algorithm="HS256")
+    return encoded_jwt
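`create_access_token` delegates the actual encoding to python-jose. At its core, HS256 encoding is just three base64url segments joined by dots, the last being an HMAC-SHA256 over the first two. A dependency-free sketch of that shape (illustrative; not a replacement for a vetted JWT library):

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWT uses unpadded base64url.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def encode_hs256(payload: dict, secret: str) -> str:
    """header.payload.signature, signed with HMAC-SHA256."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}, separators=(",", ":")).encode())
    body = b64url(json.dumps(payload, separators=(",", ":")).encode())
    signing_input = f"{header}.{body}".encode()
    sig = hmac.new(secret.encode(), signing_input, hashlib.sha256).digest()
    return f"{header}.{body}.{b64url(sig)}"

token = encode_hs256({"sub": "user@example.com", "exp": 1700000000}, "dev-secret")
_, body, _ = token.split(".")
decoded = json.loads(base64.urlsafe_b64decode(body + "=" * (-len(body) % 4)))
assert decoded["sub"] == "user@example.com"
```

One caveat on the real function: `datetime.utcnow()` is naive; `datetime.now(timezone.utc)` is the timezone-aware equivalent and avoids ambiguity when the `exp` claim is compared elsewhere.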
be/app/crud/audit.py (new file, 76 lines)
@ -0,0 +1,76 @@
|
||||
from sqlalchemy.ext.asyncio import AsyncSession
|
||||
from sqlalchemy.future import select
|
||||
from sqlalchemy import union_all, or_
|
||||
from typing import List, Optional
|
||||
from app.models import FinancialAuditLog, Base, User, Group, Expense, Settlement
|
||||
from app.schemas.audit import FinancialAuditLogCreate
|
||||
|
||||
async def create_financial_audit_log(
|
||||
db: AsyncSession,
|
||||
*,
|
||||
user_id: int | None,
|
||||
action_type: str,
|
||||
entity: Base,
|
||||
details: dict | None = None
|
||||
) -> FinancialAuditLog:
|
||||
log_entry_data = FinancialAuditLogCreate(
|
||||
user_id=user_id,
|
||||
action_type=action_type,
|
||||
entity_type=entity.__class__.__name__,
|
||||
entity_id=entity.id,
|
||||
details=details
|
||||
)
|
||||
log_entry = FinancialAuditLog(**log_entry_data.dict())
|
||||
db.add(log_entry)
|
||||
await db.flush()
|
||||
return log_entry
|
||||
|
||||
async def get_financial_audit_logs_for_group(db: AsyncSession, *, group_id: int, skip: int = 0, limit: int = 100) -> List[FinancialAuditLog]:
|
||||
"""
|
||||
Get financial audit logs for all entities that belong to a specific group.
|
||||
This includes Expenses and Settlements that are linked to the group.
|
||||
"""
|
||||
# Get all expense IDs for this group
|
||||
expense_ids_query = select(Expense.id).where(Expense.group_id == group_id)
|
||||
expense_result = await db.execute(expense_ids_query)
|
||||
expense_ids = [row[0] for row in expense_result.fetchall()]
|
||||
|
||||
# Get all settlement IDs for this group
|
||||
settlement_ids_query = select(Settlement.id).where(Settlement.group_id == group_id)
|
||||
settlement_result = await db.execute(settlement_ids_query)
|
||||
settlement_ids = [row[0] for row in settlement_result.fetchall()]
|
||||
|
||||
# Build conditions for the audit log query
|
||||
conditions = []
|
||||
if expense_ids:
|
||||
conditions.append(
|
||||
(FinancialAuditLog.entity_type == 'Expense') &
|
||||
(FinancialAuditLog.entity_id.in_(expense_ids))
|
||||
)
|
||||
if settlement_ids:
|
||||
conditions.append(
|
||||
(FinancialAuditLog.entity_type == 'Settlement') &
|
||||
(FinancialAuditLog.entity_id.in_(settlement_ids))
|
||||
)
|
||||
|
||||
# If no entities exist for this group, return empty list
|
||||
if not conditions:
|
||||
return []
|
||||
|
||||
# Query audit logs for all relevant entities
|
||||
query = select(FinancialAuditLog).where(
|
||||
or_(*conditions)
|
||||
).order_by(FinancialAuditLog.timestamp.desc()).offset(skip).limit(limit)
|
||||
|
||||
result = await db.execute(query)
|
||||
return result.scalars().all()
|
||||
|
||||
|
||||
async def get_financial_audit_logs_for_user(db: AsyncSession, *, user_id: int, skip: int = 0, limit: int = 100) -> List[FinancialAuditLog]:
|
||||
result = await db.execute(
|
||||
select(FinancialAuditLog)
|
||||
.where(FinancialAuditLog.user_id == user_id)
|
||||
.order_by(FinancialAuditLog.timestamp.desc())
|
||||
.offset(skip).limit(limit)
|
||||
)
|
||||
return result.scalars().all()
|
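The group query's logic — collect the group's expense and settlement IDs, then OR together `(entity_type, entity_id)` conditions — behaves like this pure-Python equivalent (hypothetical in-memory rows, just to pin down the semantics):

```python
def logs_for_group(logs: list[dict], expense_ids: list[int], settlement_ids: list[int]) -> list[dict]:
    """Mirror of the OR'd conditions in get_financial_audit_logs_for_group."""
    wanted = {("Expense", i) for i in expense_ids} | {("Settlement", i) for i in settlement_ids}
    if not wanted:
        return []  # no entities in the group -> empty result, matching the early return
    return [log for log in logs if (log["entity_type"], log["entity_id"]) in wanted]

logs = [
    {"entity_type": "Expense", "entity_id": 1},
    {"entity_type": "Settlement", "entity_id": 9},
    {"entity_type": "Expense", "entity_id": 7},
]
assert logs_for_group(logs, expense_ids=[1], settlement_ids=[9]) == logs[:2]
assert logs_for_group(logs, expense_ids=[], settlement_ids=[]) == []
```

The early return matters: without it, `or_()` with no conditions would produce an unconstrained (or invalid) WHERE clause rather than an empty result.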
be/app/crud/category.py (new file, 38 lines)
@@ -0,0 +1,38 @@
+from sqlalchemy.ext.asyncio import AsyncSession
+from sqlalchemy.future import select
+from typing import List, Optional
+
+from app.models import Category
+from app.schemas.category import CategoryCreate, CategoryUpdate
+
+async def create_category(db: AsyncSession, category_in: CategoryCreate, user_id: int, group_id: Optional[int] = None) -> Category:
+    db_category = Category(**category_in.dict(), user_id=user_id, group_id=group_id)
+    db.add(db_category)
+    await db.commit()
+    await db.refresh(db_category)
+    return db_category
+
+async def get_user_categories(db: AsyncSession, user_id: int) -> List[Category]:
+    result = await db.execute(select(Category).where(Category.user_id == user_id))
+    return result.scalars().all()
+
+async def get_group_categories(db: AsyncSession, group_id: int) -> List[Category]:
+    result = await db.execute(select(Category).where(Category.group_id == group_id))
+    return result.scalars().all()
+
+async def get_category(db: AsyncSession, category_id: int) -> Optional[Category]:
+    return await db.get(Category, category_id)
+
+async def update_category(db: AsyncSession, db_category: Category, category_in: CategoryUpdate) -> Category:
+    update_data = category_in.dict(exclude_unset=True)
+    for key, value in update_data.items():
+        setattr(db_category, key, value)
+    db.add(db_category)
+    await db.commit()
+    await db.refresh(db_category)
+    return db_category
+
+async def delete_category(db: AsyncSession, db_category: Category):
+    await db.delete(db_category)
+    await db.commit()
+    return db_category
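`update_category` relies on Pydantic's `dict(exclude_unset=True)` so that fields the client did not send are absent from `update_data` and therefore never overwritten. The merge itself reduces to this (a sketch on a plain dict; the `icon` field is hypothetical):

```python
def apply_partial_update(entity: dict, update_data: dict) -> dict:
    """The setattr loop from update_category, on a plain dict for illustration."""
    for key, value in update_data.items():
        entity[key] = value
    return entity

category = {"name": "Groceries", "icon": "cart"}
# Only fields the client actually sent appear in update_data.
assert apply_partial_update(category, {"name": "Food"}) == {"name": "Food", "icon": "cart"}
```

Without `exclude_unset=True`, unset optional fields would appear as `None` in the dump and null out existing columns.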
@@ -1,15 +1,16 @@
 from sqlalchemy.ext.asyncio import AsyncSession
 from sqlalchemy.future import select
-from sqlalchemy.orm import selectinload
-from sqlalchemy import union_all
+from sqlalchemy.orm import selectinload, subqueryload
+from sqlalchemy import union_all, and_, or_, delete
 from typing import List, Optional
 import logging
 from datetime import date, datetime

-from app.models import Chore, Group, User, ChoreAssignment, ChoreFrequencyEnum, ChoreTypeEnum, UserGroup
+from app.models import Chore, Group, User, ChoreAssignment, ChoreFrequencyEnum, ChoreTypeEnum, UserGroup, ChoreHistoryEventTypeEnum, UserRoleEnum, ChoreHistory, ChoreAssignmentHistory
 from app.schemas.chore import ChoreCreate, ChoreUpdate, ChoreAssignmentCreate, ChoreAssignmentUpdate
 from app.core.chore_utils import calculate_next_due_date
-from app.crud.group import get_group_by_id, is_user_member
+from app.crud.group import get_group_by_id, is_user_member, get_user_role_in_group
+from app.crud.history import create_chore_history_entry, create_assignment_history_entry
 from app.core.exceptions import ChoreNotFoundError, GroupNotFoundError, PermissionDeniedError, DatabaseIntegrityError

 logger = logging.getLogger(__name__)
@@ -17,7 +18,6 @@ logger = logging.getLogger(__name__)
 async def get_all_user_chores(db: AsyncSession, user_id: int) -> List[Chore]:
     """Gets all chores (personal and group) for a user in optimized queries."""

-    # Get personal chores query
     personal_chores_query = (
         select(Chore)
         .where(
@@ -26,7 +26,6 @@ async def get_all_user_chores(db: AsyncSession, user_id: int) -> List[Chore]:
         )
     )

-    # Get user's group IDs first
    user_groups_result = await db.execute(
         select(UserGroup.group_id).where(UserGroup.user_id == user_id)
     )
@@ -34,18 +33,19 @@ async def get_all_user_chores(db: AsyncSession, user_id: int) -> List[Chore]:

     all_chores = []

-    # Execute personal chores query
     personal_result = await db.execute(
         personal_chores_query
         .options(
             selectinload(Chore.creator),
-            selectinload(Chore.assignments).selectinload(ChoreAssignment.assigned_user)
+            selectinload(Chore.assignments).selectinload(ChoreAssignment.assigned_user),
+            selectinload(Chore.assignments).selectinload(ChoreAssignment.history),
+            selectinload(Chore.history),
+            selectinload(Chore.child_chores)
         )
         .order_by(Chore.next_due_date, Chore.name)
     )
     all_chores.extend(personal_result.scalars().all())

-    # If user has groups, get all group chores in one query
     if user_group_ids:
         group_chores_result = await db.execute(
             select(Chore)
@@ -56,7 +56,10 @@ async def get_all_user_chores(db: AsyncSession, user_id: int) -> List[Chore]:
             .options(
                 selectinload(Chore.creator),
                 selectinload(Chore.group),
-                selectinload(Chore.assignments).selectinload(ChoreAssignment.assigned_user)
+                selectinload(Chore.assignments).selectinload(ChoreAssignment.assigned_user),
+                selectinload(Chore.assignments).selectinload(ChoreAssignment.history),
+                selectinload(Chore.history),
+                selectinload(Chore.child_chores)
             )
             .order_by(Chore.next_due_date, Chore.name)
         )
@@ -67,47 +70,91 @@ async def get_all_user_chores(db: AsyncSession, user_id: int) -> List[Chore]:
 async def create_chore(
     db: AsyncSession,
     chore_in: ChoreCreate,
-    user_id: int,
-    group_id: Optional[int] = None
+    user_id: int
 ) -> Chore:
-    """Creates a new chore, either personal or within a specific group."""
-    # Use the transaction pattern from the FastAPI strategy
+    """Creates a new chore, and if specified, an assignment for it."""
     async with db.begin_nested() if db.in_transaction() else db.begin():
-        # Validate chore type and group
         if chore_in.type == ChoreTypeEnum.group:
-            if not group_id:
+            if not chore_in.group_id:
                 raise ValueError("group_id is required for group chores")
-            # Validate group existence and user membership
-            group = await get_group_by_id(db, group_id)
+            group = await get_group_by_id(db, chore_in.group_id)
             if not group:
-                raise GroupNotFoundError(group_id)
-            if not await is_user_member(db, group_id, user_id):
-                raise PermissionDeniedError(detail=f"User {user_id} not a member of group {group_id}")
+                raise GroupNotFoundError(chore_in.group_id)
+            if not await is_user_member(db, chore_in.group_id, user_id):
+                raise PermissionDeniedError(detail=f"User {user_id} not a member of group {chore_in.group_id}")
         else: # personal chore
-            if group_id:
+            if chore_in.group_id:
                 raise ValueError("group_id must be None for personal chores")

+        # Validate assigned user if provided
+        if chore_in.assigned_to_user_id:
+            if chore_in.type == ChoreTypeEnum.group:
+                # For group chores, assigned user must be a member of the group
+                if not await is_user_member(db, chore_in.group_id, chore_in.assigned_to_user_id):
+                    raise PermissionDeniedError(detail=f"Assigned user {chore_in.assigned_to_user_id} is not a member of group {chore_in.group_id}")
+            else: # Personal chore
+                # For personal chores, you can only assign it to yourself
+                if chore_in.assigned_to_user_id != user_id:
+                    raise PermissionDeniedError(detail="Personal chores can only be assigned to the creator.")
+
+        assigned_user_id = chore_in.assigned_to_user_id
+        chore_data = chore_in.model_dump(exclude_unset=True, exclude={'assigned_to_user_id'})
+
+        if 'parent_chore_id' in chore_data and chore_data['parent_chore_id']:
+            parent_chore = await get_chore_by_id(db, chore_data['parent_chore_id'])
+            if not parent_chore:
+                raise ChoreNotFoundError(chore_data['parent_chore_id'])
+
         db_chore = Chore(
-            **chore_in.model_dump(exclude_unset=True, exclude={'group_id'}),
-            group_id=group_id,
+            **chore_data,
             created_by_id=user_id,
         )

-        # Specific check for custom frequency
         if chore_in.frequency == ChoreFrequencyEnum.custom and chore_in.custom_interval_days is None:
             raise ValueError("custom_interval_days must be set for custom frequency chores.")

         db.add(db_chore)
-        await db.flush() # Get the ID for the chore
+        await db.flush()

+        # Create an assignment if a user was specified
+        if assigned_user_id:
+            assignment = ChoreAssignment(
+                chore_id=db_chore.id,
+                assigned_to_user_id=assigned_user_id,
+                due_date=db_chore.next_due_date,
+                is_complete=False
+            )
+            db.add(assignment)
+            await db.flush() # Flush to get the assignment ID
+            await create_assignment_history_entry(
+                db,
+                assignment_id=assignment.id,
+                event_type=ChoreHistoryEventTypeEnum.ASSIGNED,
+                changed_by_user_id=user_id,
+                event_data={'assigned_to': assigned_user_id}
+            )
+
+        await create_chore_history_entry(
+            db,
+            chore_id=db_chore.id,
+            group_id=db_chore.group_id,
+            changed_by_user_id=user_id,
+            event_type=ChoreHistoryEventTypeEnum.CREATED,
+            event_data={"chore_name": db_chore.name}
+        )

         try:
-            # Load relationships for the response with eager loading
             result = await db.execute(
                 select(Chore)
                 .where(Chore.id == db_chore.id)
                 .options(
                     selectinload(Chore.creator),
                     selectinload(Chore.group),
-                    selectinload(Chore.assignments)
+                    selectinload(Chore.assignments).selectinload(ChoreAssignment.assigned_user),
+                    selectinload(Chore.assignments).selectinload(ChoreAssignment.history),
+                    selectinload(Chore.history),
+                    selectinload(Chore.child_chores)
                 )
             )
             return result.scalar_one()
@@ -120,7 +167,7 @@ async def get_chore_by_id(db: AsyncSession, chore_id: int) -> Optional[Chore]:
     result = await db.execute(
         select(Chore)
         .where(Chore.id == chore_id)
-        .options(selectinload(Chore.creator), selectinload(Chore.group), selectinload(Chore.assignments))
+        .options(*get_chore_loader_options())
     )
     return result.scalar_one_or_none()

@@ -150,10 +197,7 @@ async def get_personal_chores(
             Chore.created_by_id == user_id,
             Chore.type == ChoreTypeEnum.personal
         )
-        .options(
-            selectinload(Chore.creator),
-            selectinload(Chore.assignments).selectinload(ChoreAssignment.assigned_user)
-        )
+        .options(*get_chore_loader_options())
         .order_by(Chore.next_due_date, Chore.name)
     )
     return result.scalars().all()
@@ -173,10 +217,7 @@ async def get_chores_by_group_id(
             Chore.group_id == group_id,
             Chore.type == ChoreTypeEnum.group
         )
-        .options(
-            selectinload(Chore.creator),
-            selectinload(Chore.assignments).selectinload(ChoreAssignment.assigned_user)
-        )
+        .options(*get_chore_loader_options())
         .order_by(Chore.next_due_date, Chore.name)
     )
     return result.scalars().all()
@@ -194,29 +235,46 @@ async def update_chore(
     if not db_chore:
         raise ChoreNotFoundError(chore_id, group_id)

-    # Check permissions
+    original_data = {field: getattr(db_chore, field) for field in chore_in.model_dump(exclude_unset=True)}
+
+    # Check permissions for current chore
     if db_chore.type == ChoreTypeEnum.group:
-        if not group_id:
-            raise ValueError("group_id is required for group chores")
-        if not await is_user_member(db, group_id, user_id):
-            raise PermissionDeniedError(detail=f"User {user_id} not a member of group {group_id}")
-        if db_chore.group_id != group_id:
-            raise ChoreNotFoundError(chore_id, group_id)
-    else: # personal chore
-        if group_id:
-            raise ValueError("group_id must be None for personal chores")
+        if not await is_user_member(db, db_chore.group_id, user_id):
+            raise PermissionDeniedError(detail=f"User {user_id} not a member of chore's current group {db_chore.group_id}")
+    else:
+        if db_chore.created_by_id != user_id:
+            raise PermissionDeniedError(detail="Only the creator can update personal chores")

     update_data = chore_in.model_dump(exclude_unset=True)

-    # Handle type change
+    # Handle group changes
+    if 'group_id' in update_data:
+        new_group_id = update_data['group_id']
+        if new_group_id != db_chore.group_id:
+            # Validate user has permission for the new group
+            if new_group_id is not None:
+                if not await is_user_member(db, new_group_id, user_id):
+                    raise PermissionDeniedError(detail=f"User {user_id} not a member of target group {new_group_id}")
+
+    # Handle type changes
     if 'type' in update_data:
         new_type = update_data['type']
-        if new_type == ChoreTypeEnum.group and not group_id:
-            raise ValueError("group_id is required for group chores")
-        if new_type == ChoreTypeEnum.personal and group_id:
-            raise ValueError("group_id must be None for personal chores")
+        # When changing to personal, always clear group_id regardless of what's in update_data
+        if new_type == ChoreTypeEnum.personal:
+            update_data['group_id'] = None
+        else:
+            # For group chores, use the provided group_id or keep the current one
+            new_group_id = update_data.get('group_id', db_chore.group_id)
+            if new_type == ChoreTypeEnum.group and new_group_id is None:
+                raise ValueError("group_id is required for group chores")

+    if 'parent_chore_id' in update_data:
+        if update_data['parent_chore_id']:
+            parent_chore = await get_chore_by_id(db, update_data['parent_chore_id'])
+            if not parent_chore:
+                raise ChoreNotFoundError(update_data['parent_chore_id'])
+        # Setting parent_chore_id to None is allowed
+        setattr(db_chore, 'parent_chore_id', update_data['parent_chore_id'])

     # Recalculate next_due_date if needed
     recalculate = False
@@ -245,16 +303,28 @@ async def update_chore(
     if db_chore.frequency == ChoreFrequencyEnum.custom and db_chore.custom_interval_days is None:
         raise ValueError("custom_interval_days must be set for custom frequency chores.")

+    changes = {}
+    for field, old_value in original_data.items():
+        new_value = getattr(db_chore, field)
+        if old_value != new_value:
+            changes[field] = {"old": str(old_value), "new": str(new_value)}
+
+    if changes:
+        await create_chore_history_entry(
+            db,
+            chore_id=chore_id,
+            group_id=db_chore.group_id,
+            changed_by_user_id=user_id,
+            event_type=ChoreHistoryEventTypeEnum.UPDATED,
+            event_data=changes
+        )
+
     try:
-        await db.flush() # Flush changes within the transaction
+        await db.flush()
         result = await db.execute(
             select(Chore)
             .where(Chore.id == db_chore.id)
-            .options(
-                selectinload(Chore.creator),
-                selectinload(Chore.group),
-                selectinload(Chore.assignments).selectinload(ChoreAssignment.assigned_user)
-            )
+            .options(*get_chore_loader_options())
         )
         return result.scalar_one()
     except Exception as e:
@@ -273,7 +343,15 @@ async def delete_chore(
    if not db_chore:
        raise ChoreNotFoundError(chore_id, group_id)

    # Check permissions
    await create_chore_history_entry(
        db,
        chore_id=chore_id,
        group_id=db_chore.group_id,
        changed_by_user_id=user_id,
        event_type=ChoreHistoryEventTypeEnum.DELETED,
        event_data={"chore_name": db_chore.name}
    )

    if db_chore.type == ChoreTypeEnum.group:
        if not group_id:
            raise ValueError("group_id is required for group chores")
@@ -289,7 +367,7 @@ async def delete_chore(

    try:
        await db.delete(db_chore)
        await db.flush() # Ensure deletion is processed within the transaction
        await db.flush()
        return True
    except Exception as e:
        logger.error(f"Error deleting chore {chore_id}: {e}", exc_info=True)
@@ -304,35 +382,36 @@ async def create_chore_assignment(
) -> ChoreAssignment:
    """Creates a new chore assignment. User must be able to manage the chore."""
    async with db.begin_nested() if db.in_transaction() else db.begin():
        # Get the chore and validate permissions
        chore = await get_chore_by_id(db, assignment_in.chore_id)
        if not chore:
            raise ChoreNotFoundError(chore_id=assignment_in.chore_id)

        # Check permissions to assign this chore
        if chore.type == ChoreTypeEnum.personal:
            if chore.created_by_id != user_id:
                raise PermissionDeniedError(detail="Only the creator can assign personal chores")
        else: # group chore
            if not await is_user_member(db, chore.group_id, user_id):
                raise PermissionDeniedError(detail=f"User {user_id} not a member of group {chore.group_id}")
            # For group chores, check if assignee is also a group member
            if not await is_user_member(db, chore.group_id, assignment_in.assigned_to_user_id):
                raise PermissionDeniedError(detail=f"Cannot assign chore to user {assignment_in.assigned_to_user_id} who is not a group member")

        db_assignment = ChoreAssignment(**assignment_in.model_dump(exclude_unset=True))
        db.add(db_assignment)
        await db.flush() # Get the ID for the assignment
        await db.flush()

        await create_assignment_history_entry(
            db,
            assignment_id=db_assignment.id,
            changed_by_user_id=user_id,
            event_type=ChoreHistoryEventTypeEnum.ASSIGNED,
            event_data={"assigned_to_user_id": db_assignment.assigned_to_user_id, "due_date": db_assignment.due_date.isoformat()}
        )

        try:
            # Load relationships for the response
            result = await db.execute(
                select(ChoreAssignment)
                .where(ChoreAssignment.id == db_assignment.id)
                .options(
                    selectinload(ChoreAssignment.chore).selectinload(Chore.creator),
                    selectinload(ChoreAssignment.assigned_user)
                )
                .options(*get_assignment_loader_options())
            )
            return result.scalar_one()
        except Exception as e:
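The `db.begin_nested() if db.in_transaction() else db.begin()` idiom recurring throughout these CRUD functions opens a SAVEPOINT when a transaction is already active and a fresh transaction otherwise, so the same function composes under a caller's transaction or stands alone. A minimal sketch of the dispatch (the fake session here is illustrative, not the app's AsyncSession):

```python
def begin_ctx(session):
    """Open a SAVEPOINT if a transaction is already active, else a new transaction."""
    return session.begin_nested() if session.in_transaction() else session.begin()


class FakeSession:
    """Stand-in with the two methods the idiom touches."""
    def __init__(self, active: bool):
        self._active = active

    def in_transaction(self) -> bool:
        return self._active

    def begin_nested(self) -> str:
        return "savepoint"

    def begin(self) -> str:
        return "transaction"
```

With a real SQLAlchemy `AsyncSession`, both branches return async context managers, so the call site stays a single `async with begin_ctx(db):`.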
@@ -344,10 +423,7 @@ async def get_chore_assignment_by_id(db: AsyncSession, assignment_id: int) -> Op
    result = await db.execute(
        select(ChoreAssignment)
        .where(ChoreAssignment.id == assignment_id)
        .options(
            selectinload(ChoreAssignment.chore).selectinload(Chore.creator),
            selectinload(ChoreAssignment.assigned_user)
        )
        .options(*get_assignment_loader_options())
    )
    return result.scalar_one_or_none()

@@ -362,10 +438,7 @@ async def get_user_assignments(
    if not include_completed:
        query = query.where(ChoreAssignment.is_complete == False)

    query = query.options(
        selectinload(ChoreAssignment.chore).selectinload(Chore.creator),
        selectinload(ChoreAssignment.assigned_user)
    ).order_by(ChoreAssignment.due_date, ChoreAssignment.id)
    query = query.options(*get_assignment_loader_options()).order_by(ChoreAssignment.due_date, ChoreAssignment.id)

    result = await db.execute(query)
    return result.scalars().all()
@@ -380,21 +453,17 @@ async def get_chore_assignments(
    if not chore:
        raise ChoreNotFoundError(chore_id=chore_id)

    # Check permissions
    if chore.type == ChoreTypeEnum.personal:
        if chore.created_by_id != user_id:
            raise PermissionDeniedError(detail="Can only view assignments for own personal chores")
    else: # group chore
    else:
        if not await is_user_member(db, chore.group_id, user_id):
            raise PermissionDeniedError(detail=f"User {user_id} not a member of group {chore.group_id}")

    result = await db.execute(
        select(ChoreAssignment)
        .where(ChoreAssignment.chore_id == chore_id)
        .options(
            selectinload(ChoreAssignment.chore).selectinload(Chore.creator),
            selectinload(ChoreAssignment.assigned_user)
        )
        .options(*get_assignment_loader_options())
        .order_by(ChoreAssignment.due_date, ChoreAssignment.id)
    )
    return result.scalars().all()
@@ -405,72 +474,70 @@ async def update_chore_assignment(
    assignment_in: ChoreAssignmentUpdate,
    user_id: int
) -> Optional[ChoreAssignment]:
    """Updates a chore assignment. Only the assignee can mark it complete."""
    """Updates a chore assignment, e.g., to mark it as complete."""
    async with db.begin_nested() if db.in_transaction() else db.begin():
        db_assignment = await get_chore_assignment_by_id(db, assignment_id)
        if not db_assignment:
            raise ChoreNotFoundError(assignment_id=assignment_id)
            return None

        # Load the chore for permission checking
        chore = await get_chore_by_id(db, db_assignment.chore_id)
        if not chore:
            raise ChoreNotFoundError(chore_id=db_assignment.chore_id)
        # Permission Check: only assigned user or group owner can update
        is_allowed = db_assignment.assigned_to_user_id == user_id
        if not is_allowed and db_assignment.chore.group_id:
            user_role = await get_user_role_in_group(db, db_assignment.chore.group_id, user_id)
            is_allowed = user_role == UserRoleEnum.owner

        # Check permissions - only assignee can complete, but chore managers can reschedule
        can_manage = False
        if chore.type == ChoreTypeEnum.personal:
            can_manage = chore.created_by_id == user_id
        else: # group chore
            can_manage = await is_user_member(db, chore.group_id, user_id)

        can_complete = db_assignment.assigned_to_user_id == user_id
        if not is_allowed:
            raise PermissionDeniedError("You cannot update this chore assignment.")

        original_status = db_assignment.is_complete
        update_data = assignment_in.model_dump(exclude_unset=True)

        # Check specific permissions for different updates
        if 'is_complete' in update_data and not can_complete:
            raise PermissionDeniedError(detail="Only the assignee can mark assignments as complete")

        if 'due_date' in update_data and not can_manage:
            raise PermissionDeniedError(detail="Only chore managers can reschedule assignments")

        # Handle completion logic
        if 'is_complete' in update_data and update_data['is_complete']:
            if not db_assignment.is_complete: # Only if not already complete
                update_data['completed_at'] = datetime.utcnow()

                # Update parent chore's last_completed_at and recalculate next_due_date
                chore.last_completed_at = update_data['completed_at']
                chore.next_due_date = calculate_next_due_date(
                    current_due_date=chore.next_due_date,
                    frequency=chore.frequency,
                    custom_interval_days=chore.custom_interval_days,
                    last_completed_date=chore.last_completed_at
                )
        elif 'is_complete' in update_data and not update_data['is_complete']:
            # If marking as incomplete, clear completed_at
            update_data['completed_at'] = None

        # Apply updates
        for field, value in update_data.items():
            setattr(db_assignment, field, value)

        try:
            await db.flush() # Flush changes within the transaction
        if 'is_complete' in update_data:
            new_status = update_data['is_complete']
            history_event = None
            if new_status and not original_status:
                db_assignment.completed_at = datetime.utcnow()
                history_event = ChoreHistoryEventTypeEnum.COMPLETED

            # Load relationships for the response
                # Advance the next_due_date of the parent chore
                if db_assignment.chore and db_assignment.chore.frequency != ChoreFrequencyEnum.one_time:
                    db_assignment.chore.last_completed_at = db_assignment.completed_at
                    db_assignment.chore.next_due_date = calculate_next_due_date(
                        current_due_date=db_assignment.chore.next_due_date,
                        frequency=db_assignment.chore.frequency,
                        custom_interval_days=db_assignment.chore.custom_interval_days,
                        last_completed_date=db_assignment.chore.last_completed_at.date() if db_assignment.chore.last_completed_at else None
                    )
            elif not new_status and original_status:
                db_assignment.completed_at = None
                history_event = ChoreHistoryEventTypeEnum.REOPENED
                # Policy: Do not automatically roll back parent chore's due date.

            if history_event:
                await create_assignment_history_entry(
                    db=db,
                    assignment_id=assignment_id,
                    changed_by_user_id=user_id,
                    event_type=history_event,
                    event_data={"new_status": new_status}
                )

        await db.flush()

        try:
            result = await db.execute(
                select(ChoreAssignment)
                .where(ChoreAssignment.id == db_assignment.id)
                .options(
                    selectinload(ChoreAssignment.chore).selectinload(Chore.creator),
                    selectinload(ChoreAssignment.assigned_user)
                )
                .where(ChoreAssignment.id == assignment_id)
                .options(*get_assignment_loader_options())
            )
            return result.scalar_one()
        except Exception as e:
            logger.error(f"Error updating chore assignment {assignment_id}: {e}", exc_info=True)
            raise DatabaseIntegrityError(f"Could not update chore assignment {assignment_id}. Error: {str(e)}")
            logger.error(f"Error updating assignment: {e}", exc_info=True)
            await db.rollback()
            raise DatabaseIntegrityError(f"Could not update assignment. Error: {str(e)}")

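`calculate_next_due_date` is called above but its body is not part of this diff. A hypothetical sketch of what such a calculation might look like, assuming the chore advances from the completion date (or the old due date) by a frequency-dependent number of days; all names and step sizes here are illustrative, not the app's actual helper:

```python
from datetime import date, timedelta
from typing import Optional

def next_due_date(current_due_date: date,
                  frequency: str,
                  custom_interval_days: Optional[int] = None,
                  last_completed_date: Optional[date] = None) -> date:
    """Hypothetical sketch: advance the due date by the frequency's interval."""
    base = last_completed_date or current_due_date
    steps = {"daily": 1, "weekly": 7, "monthly": 30}
    if frequency == "custom":
        if custom_interval_days is None:
            raise ValueError("custom_interval_days must be set for custom frequency")
        days = custom_interval_days
    else:
        days = steps[frequency]
    return base + timedelta(days=days)
```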
async def delete_chore_assignment(
    db: AsyncSession,
@@ -483,23 +550,51 @@ async def delete_chore_assignment(
    if not db_assignment:
        raise ChoreNotFoundError(assignment_id=assignment_id)

    # Load the chore for permission checking
    await create_assignment_history_entry(
        db,
        assignment_id=assignment_id,
        changed_by_user_id=user_id,
        event_type=ChoreHistoryEventTypeEnum.UNASSIGNED,
        event_data={"unassigned_user_id": db_assignment.assigned_to_user_id}
    )

    chore = await get_chore_by_id(db, db_assignment.chore_id)
    if not chore:
        raise ChoreNotFoundError(chore_id=db_assignment.chore_id)

    # Check permissions
    if chore.type == ChoreTypeEnum.personal:
        if chore.created_by_id != user_id:
            raise PermissionDeniedError(detail="Only the creator can delete personal chore assignments")
    else: # group chore
    else:
        if not await is_user_member(db, chore.group_id, user_id):
            raise PermissionDeniedError(detail=f"User {user_id} not a member of group {chore.group_id}")

    try:
        await db.delete(db_assignment)
        await db.flush() # Ensure deletion is processed within the transaction
        await db.flush()
        return True
    except Exception as e:
        logger.error(f"Error deleting chore assignment {assignment_id}: {e}", exc_info=True)
        raise DatabaseIntegrityError(f"Could not delete chore assignment {assignment_id}. Error: {str(e)}")

def get_chore_loader_options():
    """Returns a list of SQLAlchemy loader options for chore relationships."""
    return [
        selectinload(Chore.creator),
        selectinload(Chore.group),
        selectinload(Chore.assignments).selectinload(ChoreAssignment.assigned_user),
        selectinload(Chore.history).selectinload(ChoreHistory.changed_by_user),
        selectinload(Chore.child_chores).options(
            selectinload(Chore.assignments).selectinload(ChoreAssignment.assigned_user),
            selectinload(Chore.history).selectinload(ChoreHistory.changed_by_user),
            selectinload(Chore.creator),
            selectinload(Chore.child_chores) # Load grandchildren, adjust depth if needed
        )
    ]

def get_assignment_loader_options():
    return [
        selectinload(ChoreAssignment.assigned_user),
        selectinload(ChoreAssignment.history).selectinload(ChoreAssignmentHistory.changed_by_user),
        selectinload(ChoreAssignment.chore).options(*get_chore_loader_options())
    ]
@@ -4,9 +4,10 @@ from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.future import select
from sqlalchemy.orm import selectinload, joinedload
from sqlalchemy.exc import SQLAlchemyError, IntegrityError, OperationalError # Added import
from decimal import Decimal, ROUND_HALF_UP, InvalidOperation as DecimalInvalidOperation
from decimal import Decimal, ROUND_HALF_UP, ROUND_DOWN, InvalidOperation as DecimalInvalidOperation
from typing import Callable, List as PyList, Optional, Sequence, Dict, defaultdict, Any
from datetime import datetime, timezone # Added timezone
import json

from app.models import (
    Expense as ExpenseModel,
@@ -19,6 +20,7 @@ from app.models import (
    Item as ItemModel,
    ExpenseOverallStatusEnum, # Added
    ExpenseSplitStatusEnum, # Added
    RecurrenceTypeEnum,
)
from app.schemas.expense import ExpenseCreate, ExpenseSplitCreate, ExpenseUpdate # Removed unused ExpenseUpdate
from app.core.exceptions import (
@@ -34,6 +36,7 @@ from app.core.exceptions import (
    ExpenseOperationError # Added specific exception
)
from app.models import RecurrencePattern
from app.crud.audit import create_financial_audit_log

# Placeholder for InvalidOperationError if not defined in app.core.exceptions
# This should be a proper HTTPException subclass if used in API layer
@@ -145,13 +148,26 @@ async def create_expense(db: AsyncSession, expense_in: ExpenseCreate, current_us
        # Re-resolve context if list_id was derived from item
        final_group_id = await _resolve_expense_context(db, expense_in)

        # Create recurrence pattern if this is a recurring expense
        recurrence_pattern = None
        if expense_in.is_recurring and expense_in.recurrence_pattern:
            # Normalize recurrence type (accept both str and Enum)
            _rp_type_raw = expense_in.recurrence_pattern.type
            if isinstance(_rp_type_raw, str):
                try:
                    _rp_type_enum = RecurrenceTypeEnum[_rp_type_raw.upper()]
                except KeyError:
                    raise InvalidOperationError(f"Unsupported recurrence type: {_rp_type_raw}")
            else:
                _rp_type_enum = _rp_type_raw # assume already RecurrenceTypeEnum

            recurrence_pattern = RecurrencePattern(
                type=expense_in.recurrence_pattern.type,
                type=_rp_type_enum,
                interval=expense_in.recurrence_pattern.interval,
                days_of_week=expense_in.recurrence_pattern.days_of_week,
                days_of_week=(
                    ','.join(str(d) for d in expense_in.recurrence_pattern.days_of_week)
                    if isinstance(expense_in.recurrence_pattern.days_of_week, (list, tuple))
                    else expense_in.recurrence_pattern.days_of_week
                ),
                end_date=expense_in.recurrence_pattern.end_date,
                max_occurrences=expense_in.recurrence_pattern.max_occurrences,
                created_at=datetime.now(timezone.utc),
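The normalization added in the hunk above does two independent things: coerce a string recurrence type into the enum by name, and flatten a `days_of_week` list into a comma-separated string for storage. Both can be sketched standalone; the enum members below are stand-ins, not necessarily the app's actual `RecurrenceTypeEnum` values:

```python
from enum import Enum

class RecurrenceTypeEnum(Enum):
    # Illustrative members; the real app defines its own set.
    DAILY = "daily"
    WEEKLY = "weekly"
    MONTHLY = "monthly"

def normalize_recurrence_type(raw):
    """Accept either a RecurrenceTypeEnum or its name as a string (case-insensitive)."""
    if isinstance(raw, str):
        try:
            return RecurrenceTypeEnum[raw.upper()]
        except KeyError:
            raise ValueError(f"Unsupported recurrence type: {raw}")
    return raw  # assume already an enum member

def normalize_days_of_week(raw):
    """Store list/tuple input as a comma-separated string; pass strings through unchanged."""
    if isinstance(raw, (list, tuple)):
        return ','.join(str(d) for d in raw)
    return raw
```

Accepting both representations at the boundary keeps the stored column type stable while tolerating either shape from the Pydantic schema.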
@@ -215,6 +231,13 @@ async def create_expense(db: AsyncSession, expense_in: ExpenseCreate, current_us
            # await transaction.rollback() # Should be handled by context manager
            raise ExpenseOperationError("Failed to load expense after creation.")

        await create_financial_audit_log(
            db=db,
            user_id=current_user_id,
            action_type="EXPENSE_CREATED",
            entity=loaded_expense,
        )

        # await transaction.commit() # Explicit commit removed, context manager handles it.
        return loaded_expense

@@ -305,7 +328,7 @@ async def _generate_expense_splits(


async def _create_equal_splits(db: AsyncSession, expense_model: ExpenseModel, expense_in: ExpenseCreate, round_money_func: Callable[[Decimal], Decimal], **kwargs: Any) -> PyList[ExpenseSplitModel]:
    """Creates equal splits among users."""
    """Creates equal splits among users, distributing any rounding remainder fairly."""

    users_for_splitting = await get_users_for_splitting(
        db, expense_model.group_id, expense_model.list_id, expense_model.paid_by_user_id
@@ -314,21 +337,30 @@ async def _create_equal_splits(db: AsyncSession, expense_model: ExpenseModel, ex
        raise InvalidOperationError("No users found for EQUAL split.")

    num_users = len(users_for_splitting)
    amount_per_user = round_money_func(expense_model.total_amount / Decimal(num_users))
    remainder = expense_model.total_amount - (amount_per_user * num_users)
    # Use floor rounding initially to prevent over-shooting the total
    amount_per_user_floor = (expense_model.total_amount / Decimal(num_users)).quantize(Decimal("0.01"), rounding=ROUND_DOWN)

    splits = []
    for i, user in enumerate(users_for_splitting):
        split_amount = amount_per_user
        if i == 0 and remainder != Decimal('0'):
            split_amount = round_money_func(amount_per_user + remainder)
    # Sort users by ID to ensure deterministic remainder distribution
    users_for_splitting.sort(key=lambda u: u.id)

    for user in users_for_splitting:
        splits.append(ExpenseSplitModel(
            user_id=user.id,
            owed_amount=split_amount,
            status=ExpenseSplitStatusEnum.unpaid # Explicitly set default status
            owed_amount=amount_per_user_floor,
            status=ExpenseSplitStatusEnum.unpaid
        ))

    # Calculate remainder and distribute pennies one by one
    current_total = amount_per_user_floor * num_users
    remainder = expense_model.total_amount - current_total

    pennies_to_distribute = int(remainder * 100)

    for i in range(pennies_to_distribute):
        # The modulo ensures that if pennies > num_users (should not happen with floor), it still works
        splits[i % len(splits)].owed_amount += Decimal("0.01")

    return splits

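The floor-then-distribute approach in the hunk above (round every share down, then hand out the leftover pennies one by one in deterministic user-ID order) guarantees the shares always sum back to the total. A self-contained sketch of the same technique; `equal_splits` and its signature are illustrative, not the app's helper:

```python
from decimal import Decimal, ROUND_DOWN

def equal_splits(total: Decimal, user_ids: list[int]) -> dict[int, Decimal]:
    """Split `total` equally to two decimals; leftover pennies go to the lowest user IDs."""
    n = len(user_ids)
    # Floor first so the per-user base never over-shoots the total.
    base = (total / n).quantize(Decimal("0.01"), rounding=ROUND_DOWN)
    ordered = sorted(user_ids)  # deterministic remainder distribution
    amounts = {uid: base for uid in ordered}
    pennies = int((total - base * n) * 100)
    for i in range(pennies):
        amounts[ordered[i % n]] += Decimal("0.01")
    return amounts
```

Because the base is floored, the remainder is always a small non-negative number of pennies (at most `n - 1`), so each user receives at most one extra penny.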
@@ -366,7 +398,7 @@ async def _create_exact_amount_splits(db: AsyncSession, expense_model: ExpenseMo


async def _create_percentage_splits(db: AsyncSession, expense_model: ExpenseModel, expense_in: ExpenseCreate, round_money_func: Callable[[Decimal], Decimal], **kwargs: Any) -> PyList[ExpenseSplitModel]:
    """Creates splits based on percentages."""
    """Creates splits based on percentages, distributing any rounding remainder fairly."""

    if not expense_in.splits_in:
        raise InvalidOperationError("Splits data is required for PERCENTAGE split type.")
@@ -398,16 +430,24 @@ async def _create_percentage_splits(db: AsyncSession, expense_model: ExpenseMode
    if round_money_func(total_percentage) != Decimal("100.00"):
        raise InvalidOperationError(f"Sum of percentages ({total_percentage}%) is not 100%.")

    # Adjust for rounding differences
    # Adjust for rounding differences by distributing remainder fairly
    if current_total != expense_model.total_amount and splits:
        diff = expense_model.total_amount - current_total
        splits[-1].owed_amount = round_money_func(splits[-1].owed_amount + diff)

        # Sort by user ID to make distribution deterministic
        splits.sort(key=lambda s: s.user_id)

        pennies = int(diff * 100)
        increment = Decimal("0.01") if pennies > 0 else Decimal("-0.01")

        for i in range(abs(pennies)):
            splits[i % len(splits)].owed_amount += increment

    return splits


async def _create_shares_splits(db: AsyncSession, expense_model: ExpenseModel, expense_in: ExpenseCreate, round_money_func: Callable[[Decimal], Decimal], **kwargs: Any) -> PyList[ExpenseSplitModel]:
    """Creates splits based on shares."""
    """Creates splits based on shares, distributing any rounding remainder fairly."""

    if not expense_in.splits_in:
        raise InvalidOperationError("Splits data is required for SHARES split type.")
@@ -438,10 +478,18 @@ async def _create_shares_splits(db: AsyncSession, expense_model: ExpenseModel, e
            status=ExpenseSplitStatusEnum.unpaid # Explicitly set default status
        ))

    # Adjust for rounding differences
    # Adjust for rounding differences by distributing remainder fairly
    if current_total != expense_model.total_amount and splits:
        diff = expense_model.total_amount - current_total
        splits[-1].owed_amount = round_money_func(splits[-1].owed_amount + diff)

        # Sort by user ID to make distribution deterministic
        splits.sort(key=lambda s: s.user_id)

        pennies = int(diff * 100)
        increment = Decimal("0.01") if pennies > 0 else Decimal("-0.01")

        for i in range(abs(pennies)):
            splits[i % len(splits)].owed_amount += increment

    return splits

@@ -611,22 +659,28 @@ async def get_user_accessible_expenses(db: AsyncSession, user_id: int, skip: int
    )
    return result.scalars().all()

async def update_expense(db: AsyncSession, expense_db: ExpenseModel, expense_in: ExpenseUpdate) -> ExpenseModel:
async def update_expense(db: AsyncSession, expense_db: ExpenseModel, expense_in: ExpenseUpdate, current_user_id: int) -> ExpenseModel:
    """
    Updates an existing expense.
    Only allows updates to description, currency, and expense_date to avoid split complexities.
    Requires version matching for optimistic locking.
    Updates an expense. For now, only allows simple field updates.
    More complex updates (like changing split logic) would require a more sophisticated approach.
    """
    if expense_in.version is None:
        raise InvalidOperationError("Version is required for updating an expense.")

    if expense_db.version != expense_in.version:
        raise InvalidOperationError(
            f"Expense '{expense_db.description}' (ID: {expense_db.id}) has been modified. "
            f"Your version is {expense_in.version}, current version is {expense_db.version}. Please refresh.",
            # status_code=status.HTTP_409_CONFLICT # This would be for the API layer to set
            f"Expense '{expense_db.description}' (ID: {expense_db.id}) cannot be updated. "
            f"Your expected version {expense_in.version} does not match current version {expense_db.version}. Please refresh.",
        )

    update_data = expense_in.model_dump(exclude_unset=True, exclude={"version"}) # Exclude version itself from data
    before_state = {c.name: getattr(expense_db, c.name) for c in expense_db.__table__.columns if c.name in expense_in.dict(exclude_unset=True)}
    # A simple way to handle non-serializable types for JSON
    for k, v in before_state.items():
        if isinstance(v, (datetime, Decimal)):
            before_state[k] = str(v)

    update_data = expense_in.dict(exclude_unset=True, exclude={"version"})

    # Fields that are safe to update without affecting splits or core logic
    allowed_to_update = {"description", "currency", "expense_date"}

    updated_something = False
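The version check above is a standard optimistic-locking guard: the client must echo back the version it last read, and a mismatch means another writer got there first. The comparison itself is trivially isolatable; `StaleVersionError` and `check_version` are illustrative names, not the app's exceptions:

```python
from typing import Optional

class StaleVersionError(Exception):
    """Raised when the client's version no longer matches the stored row."""

def check_version(current_version: int, expected_version: Optional[int]) -> int:
    """Optimistic-lock guard: validate the client's version, return the next one."""
    if expected_version is None:
        raise StaleVersionError("Version is required for updating.")
    if current_version != expected_version:
        raise StaleVersionError(
            f"Expected version {expected_version} does not match current {current_version}. Please refresh."
        )
    return current_version + 1
```

The API layer would typically map this failure to HTTP 409 Conflict, as the commented-out `status_code` hint in the original code suggests.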
@@ -635,38 +689,41 @@ async def update_expense(db: AsyncSession, expense_db: ExpenseModel, expense_in:
            setattr(expense_db, field, value)
            updated_something = True
        else:
            # If any other field is present in the update payload, it's an invalid operation for this simple update
            raise InvalidOperationError(f"Field '{field}' cannot be updated. Only {', '.join(allowed_to_update)} are allowed.")

    if not updated_something and not expense_in.model_fields_set.intersection(allowed_to_update):
        # No actual updatable fields were provided in the payload, even if others (like version) were.
        # This could be a non-issue, or an indication of a misuse of the endpoint.
        # For now, if only version was sent, we still increment if it matched.
        pass # Or raise InvalidOperationError("No updatable fields provided.")
    if not updated_something:
        pass

    try:
        async with db.begin_nested() if db.in_transaction() else db.begin() as transaction:
        async with db.begin_nested() if db.in_transaction() else db.begin():
            expense_db.version += 1
            expense_db.updated_at = datetime.now(timezone.utc) # Manually update timestamp
            # db.add(expense_db) # Not strictly necessary as expense_db is already tracked by the session
            expense_db.updated_at = datetime.now(timezone.utc)

            await db.flush() # Persist changes to the DB and run constraints
            await db.refresh(expense_db) # Refresh the object from the DB
            await db.flush()

            after_state = {c.name: getattr(expense_db, c.name) for c in expense_db.__table__.columns if c.name in update_data}
            for k, v in after_state.items():
                if isinstance(v, (datetime, Decimal)):
                    after_state[k] = str(v)

            await create_financial_audit_log(
                db=db,
                user_id=current_user_id,
                action_type="EXPENSE_UPDATED",
                entity=expense_db,
                details={"before": before_state, "after": after_state}
            )

            await db.refresh(expense_db)
        return expense_db
    except InvalidOperationError: # Re-raise validation errors to be handled by the caller
        raise
    except IntegrityError as e:
        logger.error(f"Database integrity error during expense update for ID {expense_db.id}: {str(e)}", exc_info=True)
        # The transaction context manager (begin_nested/begin) handles rollback.
        raise DatabaseIntegrityError(f"Failed to update expense ID {expense_db.id} due to database integrity issue.") from e
    except SQLAlchemyError as e: # Catch other SQLAlchemy errors
    except SQLAlchemyError as e:
        logger.error(f"Database transaction error during expense update for ID {expense_db.id}: {str(e)}", exc_info=True)
        # The transaction context manager (begin_nested/begin) handles rollback.
        raise DatabaseTransactionError(f"Failed to update expense ID {expense_db.id} due to a database transaction error.") from e
    # No generic Exception catch here, let other unexpected errors propagate if not SQLAlchemy related.

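The before/after audit snapshots in `update_expense` stringify `datetime` and `Decimal` column values so the dict can be serialized into the audit log's JSON payload. That conversion is a small pure function worth isolating; `jsonable_state` is an illustrative name, not the app's helper:

```python
import json
from datetime import datetime
from decimal import Decimal

def jsonable_state(state: dict) -> dict:
    """Stringify datetimes and Decimals so the audit payload is JSON-serializable."""
    return {
        k: str(v) if isinstance(v, (datetime, Decimal)) else v
        for k, v in state.items()
    }
```

Stringifying (rather than, say, converting `Decimal` to `float`) avoids precision loss on money columns at the cost of the consumer having to parse the values back.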
async def delete_expense(db: AsyncSession, expense_db: ExpenseModel, expected_version: Optional[int] = None) -> None:
async def delete_expense(db: AsyncSession, expense_db: ExpenseModel, current_user_id: int, expected_version: Optional[int] = None) -> None:
    """
    Deletes an expense. Requires version matching if expected_version is provided.
    Associated ExpenseSplits are cascade deleted by the database foreign key constraint.
@@ -675,23 +732,33 @@ async def delete_expense(db: AsyncSession, expense_db: ExpenseModel, expected_ve
        raise InvalidOperationError(
            f"Expense '{expense_db.description}' (ID: {expense_db.id}) cannot be deleted. "
            f"Your expected version {expected_version} does not match current version {expense_db.version}. Please refresh.",
            # status_code=status.HTTP_409_CONFLICT
        )

    try:
        async with db.begin_nested() if db.in_transaction() else db.begin() as transaction:
        async with db.begin_nested() if db.in_transaction() else db.begin():
            details = {c.name: getattr(expense_db, c.name) for c in expense_db.__table__.columns}
            for k, v in details.items():
                if isinstance(v, (datetime, Decimal)):
                    details[k] = str(v)

            expense_id_for_log = expense_db.id

            await create_financial_audit_log(
                db=db,
                user_id=current_user_id,
                action_type="EXPENSE_DELETED",
                entity=expense_db,
                details=details
            )

            await db.delete(expense_db)
            await db.flush() # Ensure the delete operation is sent to the database
    except InvalidOperationError: # Re-raise validation errors
        raise
            await db.flush()
    except IntegrityError as e:
        logger.error(f"Database integrity error during expense deletion for ID {expense_db.id}: {str(e)}", exc_info=True)
        # The transaction context manager (begin_nested/begin) handles rollback.
        raise DatabaseIntegrityError(f"Failed to delete expense ID {expense_db.id} due to database integrity issue.") from e
    except SQLAlchemyError as e: # Catch other SQLAlchemy errors
        logger.error(f"Database transaction error during expense deletion for ID {expense_db.id}: {str(e)}", exc_info=True)
        # The transaction context manager (begin_nested/begin) handles rollback.
        raise DatabaseTransactionError(f"Failed to delete expense ID {expense_db.id} due to a database transaction error.") from e
        logger.error(f"Database integrity error during expense deletion for ID {expense_id_for_log}: {str(e)}", exc_info=True)
        raise DatabaseIntegrityError(f"Failed to delete expense ID {expense_id_for_log} due to database integrity issue.") from e
    except SQLAlchemyError as e:
        logger.error(f"Database transaction error during expense deletion for ID {expense_id_for_log}: {str(e)}", exc_info=True)
        raise DatabaseTransactionError(f"Failed to delete expense ID {expense_id_for_log} due to a database transaction error.") from e
    return None

# Note: The InvalidOperationError is a simple ValueError placeholder.
@@ -1,15 +1,15 @@
# app/crud/group.py
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.future import select
from sqlalchemy.orm import selectinload # For eager loading members
from sqlalchemy.orm import selectinload, joinedload, contains_eager
from sqlalchemy.exc import SQLAlchemyError, IntegrityError, OperationalError
from typing import Optional, List
from sqlalchemy import delete, func
import logging # Add logging import
from typing import Optional, List, Dict, Any, Tuple
from sqlalchemy import delete, func, and_, or_, update, desc
import logging
from datetime import datetime, timezone, timedelta

from app.models import User as UserModel, Group as GroupModel, UserGroup as UserGroupModel
from app.schemas.group import GroupCreate
from app.models import UserRoleEnum # Import enum
from app.models import User as UserModel, Group as GroupModel, UserGroup as UserGroupModel, List as ListModel, Chore as ChoreModel, ChoreAssignment as ChoreAssignmentModel
from app.schemas.group import GroupCreate, GroupPublic
from app.models import UserRoleEnum
from app.core.exceptions import (
    GroupOperationError,
    GroupNotFoundError,
@ -18,8 +18,11 @@ from app.core.exceptions import (
|
||||
DatabaseQueryError,
|
||||
DatabaseTransactionError,
|
||||
GroupMembershipError,
|
||||
GroupPermissionError # Import GroupPermissionError
|
||||
GroupPermissionError,
|
||||
PermissionDeniedError,
|
||||
InvalidOperationError
|
||||
)
|
||||
from app.core.cache import cache
|
||||
|
||||
logger = logging.getLogger(__name__) # Initialize logger
|
||||
|
||||
@ -79,7 +82,8 @@ async def get_user_groups(db: AsyncSession, user_id: int) -> List[GroupModel]:
|
||||
.options(
|
||||
selectinload(GroupModel.member_associations).options(
|
||||
selectinload(UserGroupModel.user)
|
||||
)
|
||||
),
|
||||
selectinload(GroupModel.chore_history) # Eager load chore history
|
||||
)
|
||||
)
|
||||
return result.scalars().all()
|
||||
@ -88,21 +92,18 @@ async def get_user_groups(db: AsyncSession, user_id: int) -> List[GroupModel]:
|
||||
except SQLAlchemyError as e:
|
||||
raise DatabaseQueryError(f"Failed to query user groups: {str(e)}")
|
||||
|
||||
@cache(expire_time=1800, key_prefix="group") # Cache for 30 minutes
|
||||
async def get_group_by_id(db: AsyncSession, group_id: int) -> Optional[GroupModel]:
|
||||
"""Gets a single group by its ID, optionally loading members."""
|
||||
try:
|
||||
"""Get a group by its ID with caching, including member associations and chore history."""
|
||||
result = await db.execute(
|
||||
select(GroupModel)
|
||||
.where(GroupModel.id == group_id)
|
||||
.options(
|
||||
selectinload(GroupModel.member_associations).selectinload(UserGroupModel.user)
|
||||
selectinload(GroupModel.member_associations).selectinload(UserGroupModel.user),
|
||||
selectinload(GroupModel.chore_history)
|
||||
)
|
||||
)
|
||||
return result.scalars().first()
|
||||
except OperationalError as e:
|
||||
raise DatabaseConnectionError(f"Failed to connect to database: {str(e)}")
|
||||
except SQLAlchemyError as e:
|
||||
raise DatabaseQueryError(f"Failed to query group: {str(e)}")
|
||||
return result.scalar_one_or_none()
|
||||
|
||||
async def is_user_member(db: AsyncSession, group_id: int, user_id: int) -> bool:
|
||||
"""Checks if a user is a member of a specific group."""
|
||||
@@ -146,6 +147,13 @@ async def add_user_to_group(db: AsyncSession, group_id: int, user_id: int, role:
         db.add(db_user_group)
         await db.flush() # Assigns ID to db_user_group

+        # Optimistic locking: bump group version atomically
+        await db.execute(
+            update(GroupModel)
+            .where(GroupModel.id == group_id)
+            .values(version=GroupModel.version + 1)
+        )
+
         # Eagerly load the 'user' and 'group' relationships for the response
         stmt = (
             select(UserGroupModel)
@@ -181,7 +189,16 @@ async def remove_user_from_group(db: AsyncSession, group_id: int, user_id: int)
             .where(UserGroupModel.group_id == group_id, UserGroupModel.user_id == user_id)
             .returning(UserGroupModel.id)
         )
-        return result.scalar_one_or_none() is not None
+        deleted = result.scalar_one_or_none() is not None
+
+        if deleted:
+            await db.execute(
+                update(GroupModel)
+                .where(GroupModel.id == group_id)
+                .values(version=GroupModel.version + 1)
+            )
+
+        return deleted
     except OperationalError as e:
         logger.error(f"Database connection error while removing user from group: {str(e)}", exc_info=True)
         raise DatabaseConnectionError(f"Database connection error: {str(e)}")
@@ -271,7 +288,7 @@ async def check_user_role_in_group(
     # If role is sufficient, return None
     return None

-async def delete_group(db: AsyncSession, group_id: int) -> None:
+async def delete_group(db: AsyncSession, group_id: int, *, expected_version: int | None = None) -> None:
     """
     Deletes a group and all its associated data (members, invites, lists, etc.).
     The cascade delete in the models will handle the deletion of related records.
@@ -286,6 +303,12 @@ async def delete_group(db: AsyncSession, group_id: int) -> None:
         if not group:
             raise GroupNotFoundError(group_id)

+        # Optimistic locking – ensure caller had latest version
+        if expected_version is not None and group.version != expected_version:
+            raise InvalidOperationError(
+                f"Version mismatch for group {group_id}. Current version is {group.version}, expected {expected_version}."
+            )
+
         # Delete the group - cascading delete will handle related records
         await db.delete(group)
         await db.flush()
82	be/app/crud/history.py	Normal file
@@ -0,0 +1,82 @@
+from sqlalchemy.ext.asyncio import AsyncSession
+from sqlalchemy.future import select
+from sqlalchemy.orm import selectinload
+from typing import List, Optional, Any, Dict
+
+from app.models import ChoreHistory, ChoreAssignmentHistory, ChoreHistoryEventTypeEnum, User, Chore, Group
+from app.schemas.chore import ChoreHistoryPublic, ChoreAssignmentHistoryPublic
+
+async def create_chore_history_entry(
+    db: AsyncSession,
+    *,
+    chore_id: Optional[int],
+    group_id: Optional[int],
+    changed_by_user_id: Optional[int],
+    event_type: ChoreHistoryEventTypeEnum,
+    event_data: Optional[Dict[str, Any]] = None,
+) -> ChoreHistory:
+    """Logs an event in the chore history."""
+    history_entry = ChoreHistory(
+        chore_id=chore_id,
+        group_id=group_id,
+        changed_by_user_id=changed_by_user_id,
+        event_type=event_type,
+        event_data=event_data or {},
+    )
+    db.add(history_entry)
+    await db.flush()
+    await db.refresh(history_entry)
+    return history_entry
+
+async def create_assignment_history_entry(
+    db: AsyncSession,
+    *,
+    assignment_id: int,
+    changed_by_user_id: int,
+    event_type: ChoreHistoryEventTypeEnum,
+    event_data: Optional[Dict[str, Any]] = None,
+) -> ChoreAssignmentHistory:
+    """Logs an event in the chore assignment history."""
+    history_entry = ChoreAssignmentHistory(
+        assignment_id=assignment_id,
+        changed_by_user_id=changed_by_user_id,
+        event_type=event_type,
+        event_data=event_data or {},
+    )
+    db.add(history_entry)
+    await db.flush()
+    await db.refresh(history_entry)
+    return history_entry
+
+async def get_chore_history(db: AsyncSession, chore_id: int) -> List[ChoreHistory]:
+    """Gets all history for a specific chore."""
+    result = await db.execute(
+        select(ChoreHistory)
+        .where(ChoreHistory.chore_id == chore_id)
+        .options(selectinload(ChoreHistory.changed_by_user))
+        .order_by(ChoreHistory.timestamp.desc())
+    )
+    return result.scalars().all()
+
+async def get_assignment_history(db: AsyncSession, assignment_id: int) -> List[ChoreAssignmentHistory]:
+    """Gets all history for a specific assignment."""
+    result = await db.execute(
+        select(ChoreAssignmentHistory)
+        .where(ChoreAssignmentHistory.assignment_id == assignment_id)
+        .options(selectinload(ChoreAssignmentHistory.changed_by_user))
+        .order_by(ChoreAssignmentHistory.timestamp.desc())
+    )
+    return result.scalars().all()
+
+async def get_group_chore_history(db: AsyncSession, group_id: int) -> List[ChoreHistory]:
+    """Gets all chore-related history for a group, including chore-specific and group-level events."""
+    result = await db.execute(
+        select(ChoreHistory)
+        .where(ChoreHistory.group_id == group_id)
+        .options(
+            selectinload(ChoreHistory.changed_by_user),
+            selectinload(ChoreHistory.chore)
+        )
+        .order_by(ChoreHistory.timestamp.desc())
+    )
+    return result.scalars().all()
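The helpers in this new module follow an append-only event-log pattern: each mutation adds an immutable row carrying an event type and a free-form payload, and readers sort newest first. A minimal in-memory sketch; the dataclass and the `seq` counter stand in for the real models and their database timestamps:

```python
from dataclasses import dataclass, field
from enum import Enum
from itertools import count
from typing import Any, Dict, List, Optional

class ChoreHistoryEventType(Enum):
    CREATED = "created"
    COMPLETED = "completed"

_seq = count(1)  # stands in for the DB timestamp, so ordering is deterministic

@dataclass
class HistoryEntry:
    chore_id: int
    event_type: ChoreHistoryEventType
    event_data: Dict[str, Any] = field(default_factory=dict)
    seq: int = field(default_factory=lambda: next(_seq))

log: List[HistoryEntry] = []

def record(chore_id: int, event_type: ChoreHistoryEventType,
           event_data: Optional[Dict[str, Any]] = None) -> HistoryEntry:
    # Mirrors create_chore_history_entry: default the payload to {}, then append.
    entry = HistoryEntry(chore_id, event_type, event_data or {})
    log.append(entry)
    return entry

record(1, ChoreHistoryEventType.CREATED)
record(1, ChoreHistoryEventType.COMPLETED, {"completed_by": "alice"})

# Newest first, like the .order_by(timestamp.desc()) queries above.
history = sorted((e for e in log if e.chore_id == 1),
                 key=lambda e: e.seq, reverse=True)
print([e.event_type.value for e in history])  # ['completed', 'created']
```

Note that rows are never updated or deleted, which is what makes the log a trustworthy audit trail.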
@@ -1,26 +1,24 @@
 # app/crud/invite.py
-import logging # Add logging import
+import logging
 import secrets
 from datetime import datetime, timedelta, timezone
 from sqlalchemy.ext.asyncio import AsyncSession
 from sqlalchemy.future import select
-from sqlalchemy.orm import selectinload # Ensure selectinload is imported
-from sqlalchemy import delete # Import delete statement
+from sqlalchemy.orm import selectinload
+from sqlalchemy import delete
 from sqlalchemy.exc import SQLAlchemyError, OperationalError, IntegrityError
 from typing import Optional

-from app.models import Invite as InviteModel, Group as GroupModel, User as UserModel # Import related models for selectinload
+from app.models import Invite as InviteModel, Group as GroupModel, User as UserModel
 from app.core.exceptions import (
     DatabaseConnectionError,
     DatabaseIntegrityError,
     DatabaseQueryError,
     DatabaseTransactionError,
-    InviteOperationError # Add new specific exception
+    InviteOperationError
 )

-logger = logging.getLogger(__name__) # Initialize logger
+logger = logging.getLogger(__name__)

 # Invite codes should be reasonably unique, but handle potential collision
 MAX_CODE_GENERATION_ATTEMPTS = 5

 async def deactivate_all_active_invites_for_group(db: AsyncSession, group_id: int):
@@ -35,15 +33,13 @@ async def deactivate_all_active_invites_for_group(db: AsyncSession, group_id: in
         active_invites = result.scalars().all()

         if not active_invites:
-            return # No active invites to deactivate
+            return

         for invite in active_invites:
             invite.is_active = False
             db.add(invite)
-        await db.flush() # Flush changes within this transaction block
-
-        # await db.flush() # Removed: Rely on caller to flush/commit
-        # No explicit commit here, assuming it's part of a larger transaction or caller handles commit.
+        await db.flush()
     except OperationalError as e:
         logger.error(f"Database connection error deactivating invites for group {group_id}: {str(e)}", exc_info=True)
         raise DatabaseConnectionError(f"DB connection error deactivating invites for group {group_id}: {str(e)}")
@@ -51,12 +47,11 @@ async def deactivate_all_active_invites_for_group(db: AsyncSession, group_id: in
         logger.error(f"Unexpected SQLAlchemy error deactivating invites for group {group_id}: {str(e)}", exc_info=True)
         raise DatabaseTransactionError(f"DB transaction error deactivating invites for group {group_id}: {str(e)}")

-async def create_invite(db: AsyncSession, group_id: int, creator_id: int, expires_in_days: int = 365 * 100) -> Optional[InviteModel]: # Default to 100 years
+async def create_invite(db: AsyncSession, group_id: int, creator_id: int, expires_in_days: int = 365 * 100) -> Optional[InviteModel]:
     """Creates a new invite code for a group, deactivating any existing active ones for that group first."""
-
     try:
         async with db.begin_nested() if db.in_transaction() else db.begin():
             # Deactivate existing active invites for this group
             await deactivate_all_active_invites_for_group(db, group_id)

             expires_at = datetime.now(timezone.utc) + timedelta(days=expires_in_days)
@@ -101,7 +96,7 @@ async def create_invite(db: AsyncSession, group_id: int, creator_id: int, expire
             raise InviteOperationError("Failed to load invite after creation and flush.")

         return loaded_invite
-    except InviteOperationError: # Already specific, re-raise
+    except InviteOperationError:
         raise
     except IntegrityError as e:
         logger.error(f"Database integrity error during invite creation for group {group_id}: {str(e)}", exc_info=True)
@@ -121,13 +116,12 @@ async def get_active_invite_for_group(db: AsyncSession, group_id: int) -> Option
         select(InviteModel).where(
             InviteModel.group_id == group_id,
             InviteModel.is_active == True,
-            InviteModel.expires_at > now # Still respect expiry, even if very long
+            InviteModel.expires_at > now
         )
         .order_by(InviteModel.created_at.desc()) # Get the most recent one if multiple (should not happen)
         .limit(1)
         .options(
-            selectinload(InviteModel.group), # Eager load group
-            selectinload(InviteModel.creator) # Eager load creator
+            selectinload(InviteModel.group),
+            selectinload(InviteModel.creator)
         )
     )
     result = await db.execute(stmt)
@@ -166,10 +160,9 @@ async def deactivate_invite(db: AsyncSession, invite: InviteModel) -> InviteMode
     try:
         async with db.begin_nested() if db.in_transaction() else db.begin() as transaction:
             invite.is_active = False
-            db.add(invite) # Add to session to track change
-            await db.flush() # Persist is_active change
+            db.add(invite)
+            await db.flush()

             # Re-fetch with relationships
             stmt = (
                 select(InviteModel)
                 .where(InviteModel.id == invite.id)
@@ -181,7 +174,7 @@ async def deactivate_invite(db: AsyncSession, invite: InviteModel) -> InviteMode
             result = await db.execute(stmt)
             updated_invite = result.scalar_one_or_none()

-            if updated_invite is None: # Should not happen as invite is passed in
+            if updated_invite is None:
                 raise InviteOperationError("Failed to load invite after deactivation.")

             return updated_invite
@@ -192,8 +185,3 @@ async def deactivate_invite(db: AsyncSession, invite: InviteModel) -> InviteMode
         logger.error(f"Unexpected SQLAlchemy error deactivating invite: {str(e)}", exc_info=True)
         raise DatabaseTransactionError(f"DB transaction error deactivating invite: {str(e)}")
-
-# Ensure InviteOperationError is defined in app.core.exceptions
-# Example: class InviteOperationError(AppException): pass
-
-# Optional: Function to periodically delete old, inactive invites
-# async def cleanup_old_invites(db: AsyncSession, older_than_days: int = 30): ...
@@ -1,15 +1,14 @@
 # app/crud/item.py
 from sqlalchemy.ext.asyncio import AsyncSession
 from sqlalchemy.future import select
-from sqlalchemy.orm import selectinload # Ensure selectinload is imported
-from sqlalchemy import delete as sql_delete, update as sql_update # Use aliases
+from sqlalchemy.orm import selectinload
+from sqlalchemy import delete as sql_delete, update as sql_update
 from sqlalchemy.exc import SQLAlchemyError, IntegrityError, OperationalError
 from typing import Optional, List as PyList
 from datetime import datetime, timezone
-import logging # Add logging import
+import logging
 from sqlalchemy import func

-from app.models import Item as ItemModel, User as UserModel # Import UserModel for type hints if needed for selectinload
+from app.models import Item as ItemModel, User as UserModel
 from app.schemas.item import ItemCreate, ItemUpdate
 from app.core.exceptions import (
     ItemNotFoundError,
@@ -18,16 +17,15 @@ from app.core.exceptions import (
     DatabaseQueryError,
     DatabaseTransactionError,
     ConflictError,
-    ItemOperationError # Add if specific item operation errors are needed
+    ItemOperationError
 )

-logger = logging.getLogger(__name__) # Initialize logger
+logger = logging.getLogger(__name__)

 async def create_item(db: AsyncSession, item_in: ItemCreate, list_id: int, user_id: int) -> ItemModel:
     """Creates a new item record for a specific list, setting its position."""
     try:
-        async with db.begin_nested() if db.in_transaction() else db.begin() as transaction:
-            # Get the current max position in the list
+        async with db.begin_nested() if db.in_transaction() else db.begin() as transaction: # Start transaction
             max_pos_stmt = select(func.max(ItemModel.position)).where(ItemModel.list_id == list_id)
             max_pos_result = await db.execute(max_pos_stmt)
             max_pos = max_pos_result.scalar_one_or_none() or 0
@@ -35,29 +33,28 @@ async def create_item(db: AsyncSession, item_in: ItemCreate, list_id: int, user_
             db_item = ItemModel(
                 name=item_in.name,
                 quantity=item_in.quantity,
+                category_id=item_in.category_id,
                 list_id=list_id,
                 added_by_id=user_id,
                 is_complete=False,
-                position=max_pos + 1 # Set the new position
+                position=max_pos + 1
             )
             db.add(db_item)
-            await db.flush() # Assigns ID
+            await db.flush()

             # Re-fetch with relationships
             stmt = (
                 select(ItemModel)
                 .where(ItemModel.id == db_item.id)
                 .options(
                     selectinload(ItemModel.added_by_user),
-                    selectinload(ItemModel.completed_by_user) # Will be None but loads relationship
+                    selectinload(ItemModel.completed_by_user)
                 )
             )
             result = await db.execute(stmt)
             loaded_item = result.scalar_one_or_none()

             if loaded_item is None:
-                # await transaction.rollback() # Redundant, context manager handles rollback on exception
-                raise ItemOperationError("Failed to load item after creation.") # Define ItemOperationError
+                raise ItemOperationError("Failed to load item after creation.")

             return loaded_item
     except IntegrityError as e:
@@ -69,8 +66,6 @@ async def create_item(db: AsyncSession, item_in: ItemCreate, list_id: int, user_
     except SQLAlchemyError as e:
         logger.error(f"Unexpected SQLAlchemy error during item creation: {str(e)}", exc_info=True)
         raise DatabaseTransactionError(f"Failed to create item: {str(e)}")
-    # Removed generic Exception block as SQLAlchemyError should cover DB issues,
-    # and context manager handles rollback.

 async def get_items_by_list_id(db: AsyncSession, list_id: int) -> PyList[ItemModel]:
     """Gets all items belonging to a specific list, ordered by creation time."""
@@ -100,7 +95,7 @@ async def get_item_by_id(db: AsyncSession, item_id: int) -> Optional[ItemModel]:
         .options(
             selectinload(ItemModel.added_by_user),
             selectinload(ItemModel.completed_by_user),
-            selectinload(ItemModel.list) # Often useful to get the parent list
+            selectinload(ItemModel.list)
         )
     )
     result = await db.execute(stmt)
@@ -113,7 +108,7 @@ async def get_item_by_id(db: AsyncSession, item_id: int) -> Optional[ItemModel]:
 async def update_item(db: AsyncSession, item_db: ItemModel, item_in: ItemUpdate, user_id: int) -> ItemModel:
     """Updates an existing item record, checking for version conflicts and handling reordering."""
     try:
-        async with db.begin_nested() if db.in_transaction() else db.begin() as transaction:
+        async with db.begin_nested() if db.in_transaction() else db.begin() as transaction: # Start transaction
             if item_db.version != item_in.version:
                 raise ConflictError(
                     f"Item '{item_db.name}' (ID: {item_db.id}) has been modified. "
@@ -122,31 +117,26 @@ async def update_item(db: AsyncSession, item_db: ItemModel, item_in: ItemUpdate,

             update_data = item_in.model_dump(exclude_unset=True, exclude={'version'})

-            # --- Handle Reordering ---
-            if 'position' in update_data:
-                new_position = update_data.pop('position') # Remove from update_data to handle separately
+            if 'category_id' in update_data:
+                item_db.category_id = update_data.pop('category_id')
+
+            if 'position' in update_data:
+                new_position = update_data.pop('position')

                 # We need the full list to reorder, making sure it's loaded and ordered
                 list_id = item_db.list_id
                 stmt = select(ItemModel).where(ItemModel.list_id == list_id).order_by(ItemModel.position.asc(), ItemModel.created_at.asc())
                 result = await db.execute(stmt)
                 items_in_list = result.scalars().all()

                 # Find the item to move
                 item_to_move = next((it for it in items_in_list if it.id == item_db.id), None)
                 if item_to_move:
                     items_in_list.remove(item_to_move)
                     # Insert at the new position (adjust for 1-based index from frontend)
                     # Clamp position to be within bounds
                     insert_pos = max(0, min(new_position - 1, len(items_in_list)))
                     items_in_list.insert(insert_pos, item_to_move)

                     # Re-assign positions
                     for i, item in enumerate(items_in_list):
                         item.position = i + 1

-            # --- End Handle Reordering ---
-
             if 'is_complete' in update_data:
                 if update_data['is_complete'] is True:
                     if item_db.completed_by_id is None:
@@ -158,10 +148,9 @@ async def update_item(db: AsyncSession, item_db: ItemModel, item_in: ItemUpdate,
                 setattr(item_db, key, value)

             item_db.version += 1
-            db.add(item_db) # Mark as dirty
+            db.add(item_db)
             await db.flush()

             # Re-fetch with relationships
             stmt = (
                 select(ItemModel)
                 .where(ItemModel.id == item_db.id)
@@ -174,8 +163,7 @@ async def update_item(db: AsyncSession, item_db: ItemModel, item_in: ItemUpdate,
             result = await db.execute(stmt)
             updated_item = result.scalar_one_or_none()

-            if updated_item is None: # Should not happen
-                # Rollback will be handled by context manager on raise
+            if updated_item is None:
                 raise ItemOperationError("Failed to load item after update.")

             return updated_item
@@ -185,7 +173,7 @@ async def update_item(db: AsyncSession, item_db: ItemModel, item_in: ItemUpdate,
     except OperationalError as e:
         logger.error(f"Database connection error while updating item: {str(e)}", exc_info=True)
         raise DatabaseConnectionError(f"Database connection error while updating item: {str(e)}")
-    except ConflictError: # Re-raise ConflictError, rollback handled by context manager
+    except ConflictError:
         raise
     except SQLAlchemyError as e:
         logger.error(f"Unexpected SQLAlchemy error during item update: {str(e)}", exc_info=True)
@@ -196,14 +184,9 @@ async def delete_item(db: AsyncSession, item_db: ItemModel) -> None:
     try:
         async with db.begin_nested() if db.in_transaction() else db.begin() as transaction:
             await db.delete(item_db)
-            # await transaction.commit() # Removed
-        # No return needed for None
     except OperationalError as e:
         logger.error(f"Database connection error while deleting item: {str(e)}", exc_info=True)
         raise DatabaseConnectionError(f"Database connection error while deleting item: {str(e)}")
     except SQLAlchemyError as e:
         logger.error(f"Unexpected SQLAlchemy error while deleting item: {str(e)}", exc_info=True)
         raise DatabaseTransactionError(f"Failed to delete item: {str(e)}")
-
-# Ensure ItemOperationError is defined in app.core.exceptions if used
-# Example: class ItemOperationError(AppException): pass
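The reordering branch in `update_item` is a small list algorithm: remove the moved item, clamp the 1-based target position to bounds, re-insert, then re-number every position contiguously. It can be exercised in isolation on plain ids; `reorder` here is an illustrative helper, not a function from the codebase:

```python
def reorder(ids: list[int], item_id: int, new_position: int) -> dict[int, int]:
    """Move item_id to new_position (1-based) and re-assign all positions."""
    items = list(ids)
    if item_id in items:
        items.remove(item_id)
        # Clamp to bounds, adjusting for the 1-based index from the frontend.
        insert_pos = max(0, min(new_position - 1, len(items)))
        items.insert(insert_pos, item_id)
    # Re-assign contiguous 1-based positions, exactly like the enumerate loop above.
    return {it: i + 1 for i, it in enumerate(items)}

positions = reorder([10, 20, 30, 40], item_id=40, new_position=2)
print(positions)  # {10: 1, 40: 2, 20: 3, 30: 4}
```

Re-numbering the whole list on every move keeps positions gap-free, at the cost of touching every row in the list.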
@@ -1,11 +1,11 @@
 # app/crud/list.py
 from sqlalchemy.ext.asyncio import AsyncSession
 from sqlalchemy.future import select
 from sqlalchemy.orm import selectinload, joinedload
 from sqlalchemy import or_, and_, delete as sql_delete, func as sql_func, desc
 from sqlalchemy.exc import SQLAlchemyError, IntegrityError, OperationalError
 from typing import Optional, List as PyList
-import logging # Add logging import
+import logging
 from datetime import datetime, timezone

 from app.schemas.list import ListStatus
 from app.models import List as ListModel, UserGroup as UserGroupModel, Item as ItemModel
@@ -22,12 +22,12 @@ from app.core.exceptions import (
     ListOperationError
 )

-logger = logging.getLogger(__name__) # Initialize logger
+logger = logging.getLogger(__name__)

 async def create_list(db: AsyncSession, list_in: ListCreate, creator_id: int) -> ListModel:
     """Creates a new list record."""
     try:
-        async with db.begin_nested() if db.in_transaction() else db.begin() as transaction:
+        async with db.begin_nested() if db.in_transaction() else db.begin() as transaction: # Start transaction
             db_list = ListModel(
                 name=list_in.name,
                 description=list_in.description,
@@ -36,16 +36,14 @@ async def create_list(db: AsyncSession, list_in: ListCreate, creator_id: int) ->
                 is_complete=False
             )
             db.add(db_list)
-            await db.flush() # Assigns ID
+            await db.flush()

             # Re-fetch with relationships for the response
             stmt = (
                 select(ListModel)
                 .where(ListModel.id == db_list.id)
                 .options(
                     selectinload(ListModel.creator),
                     selectinload(ListModel.group)
-                    # selectinload(ListModel.items) # Optionally add if items are always needed in response
                 )
             )
             result = await db.execute(stmt)
@@ -65,7 +63,7 @@ async def create_list(db: AsyncSession, list_in: ListCreate, creator_id: int) ->
         logger.error(f"Unexpected SQLAlchemy error during list creation: {str(e)}", exc_info=True)
         raise DatabaseTransactionError(f"Failed to create list: {str(e)}")

-async def get_lists_for_user(db: AsyncSession, user_id: int) -> PyList[ListModel]:
+async def get_lists_for_user(db: AsyncSession, user_id: int, include_archived: bool = False) -> PyList[ListModel]:
     """Gets all lists accessible by a user."""
     try:
         group_ids_result = await db.execute(
@@ -79,19 +77,19 @@ async def get_lists_for_user(db: AsyncSession, user_id: int) -> PyList[ListModel
         if user_group_ids:
             conditions.append(ListModel.group_id.in_(user_group_ids))

-        query = (
-            select(ListModel)
-            .where(or_(*conditions))
-            .options(
+        query = select(ListModel).where(or_(*conditions))
+
+        if not include_archived:
+            query = query.where(ListModel.archived_at.is_(None))
+
+        query = query.options(
             selectinload(ListModel.creator),
             selectinload(ListModel.group),
             selectinload(ListModel.items).options(
                 joinedload(ItemModel.added_by_user),
                 joinedload(ItemModel.completed_by_user)
             )
-            )
-            .order_by(ListModel.updated_at.desc())
-        )
+        ).order_by(ListModel.updated_at.desc())

         result = await db.execute(query)
         return result.scalars().all()
@@ -129,7 +127,7 @@ async def update_list(db: AsyncSession, list_db: ListModel, list_in: ListUpdate)
     """Updates an existing list record, checking for version conflicts."""
     try:
         async with db.begin_nested() if db.in_transaction() else db.begin() as transaction:
-            if list_db.version != list_in.version: # list_db here is the one passed in, pre-loaded by API layer
+            if list_db.version != list_in.version:
                 raise ConflictError(
                     f"List '{list_db.name}' (ID: {list_db.id}) has been modified. "
                     f"Your version is {list_in.version}, current version is {list_db.version}. Please refresh."
@@ -145,20 +143,18 @@ async def update_list(db: AsyncSession, list_db: ListModel, list_in: ListUpdate)
             db.add(list_db) # Add the already attached list_db to mark it dirty for the session
             await db.flush()

             # Re-fetch with relationships for the response
             stmt = (
                 select(ListModel)
                 .where(ListModel.id == list_db.id)
                 .options(
                     selectinload(ListModel.creator),
                     selectinload(ListModel.group)
-                    # selectinload(ListModel.items) # Optionally add if items are always needed in response
                 )
             )
             result = await db.execute(stmt)
             updated_list = result.scalar_one_or_none()

-            if updated_list is None: # Should not happen
+            if updated_list is None:
                 raise ListOperationError("Failed to load list after update.")

             return updated_list
@ -174,17 +170,35 @@ async def update_list(db: AsyncSession, list_db: ListModel, list_in: ListUpdate)
|
||||
logger.error(f"Unexpected SQLAlchemy error during list update: {str(e)}", exc_info=True)
|
||||
raise DatabaseTransactionError(f"Failed to update list: {str(e)}")
|
||||
|
||||
async def delete_list(db: AsyncSession, list_db: ListModel) -> None:
|
||||
"""Deletes a list record. Version check should be done by the caller (API endpoint)."""
|
||||
async def archive_list(db: AsyncSession, list_db: ListModel) -> ListModel:
|
||||
"""Archives a list record by setting the archived_at timestamp."""
|
||||
try:
|
||||
async with db.begin_nested() if db.in_transaction() else db.begin() as transaction: # Standardize transaction
|
||||
await db.delete(list_db)
|
||||
async with db.begin_nested() if db.in_transaction() else db.begin() as transaction:
|
||||
list_db.archived_at = datetime.now(timezone.utc)
|
||||
await db.flush()
|
||||
await db.refresh(list_db)
|
||||
return list_db
|
||||
except OperationalError as e:
|
||||
logger.error(f"Database connection error while deleting list: {str(e)}", exc_info=True)
|
||||
raise DatabaseConnectionError(f"Database connection error while deleting list: {str(e)}")
|
||||
logger.error(f"Database connection error while archiving list: {str(e)}", exc_info=True)
|
||||
raise DatabaseConnectionError(f"Database connection error while archiving list: {str(e)}")
|
||||
except SQLAlchemyError as e:
|
||||
logger.error(f"Unexpected SQLAlchemy error while deleting list: {str(e)}", exc_info=True)
|
||||
raise DatabaseTransactionError(f"Failed to delete list: {str(e)}")
|
||||
logger.error(f"Unexpected SQLAlchemy error while archiving list: {str(e)}", exc_info=True)
|
||||
raise DatabaseTransactionError(f"Failed to archive list: {str(e)}")
|
||||
|
||||
async def unarchive_list(db: AsyncSession, list_db: ListModel) -> ListModel:
|
||||
"""Unarchives a list record by setting the archived_at timestamp to None."""
|
||||
try:
|
||||
async with db.begin_nested() if db.in_transaction() else db.begin() as transaction:
|
||||
list_db.archived_at = None
|
||||
await db.flush()
|
||||
await db.refresh(list_db)
|
||||
return list_db
|
||||
except OperationalError as e:
|
||||
logger.error(f"Database connection error while unarchiving list: {str(e)}", exc_info=True)
|
||||
raise DatabaseConnectionError(f"Database connection error while unarchiving list: {str(e)}")
|
||||
except SQLAlchemyError as e:
|
||||
logger.error(f"Unexpected SQLAlchemy error while unarchiving list: {str(e)}", exc_info=True)
|
||||
raise DatabaseTransactionError(f"Failed to unarchive list: {str(e)}")
|
||||
|
||||
async def check_list_permission(db: AsyncSession, list_id: int, user_id: int, require_creator: bool = False) -> ListModel:
|
||||
"""Fetches a list and verifies user permission."""
|
||||
@ -257,7 +271,6 @@ async def get_list_by_name_and_group(
|
||||
Used for conflict resolution when creating lists.
|
||||
"""
|
||||
try:
|
||||
# Base query for the list itself
|
||||
base_query = select(ListModel).where(ListModel.name == name)
|
||||
|
||||
if group_id is not None:
|
||||
@ -265,7 +278,6 @@ async def get_list_by_name_and_group(
|
||||
else:
|
||||
base_query = base_query.where(ListModel.group_id.is_(None))
|
||||
|
||||
# Add eager loading for common relationships
|
||||
base_query = base_query.options(
|
||||
selectinload(ListModel.creator),
|
||||
selectinload(ListModel.group)
|
||||
@ -277,19 +289,17 @@ async def get_list_by_name_and_group(
|
||||
if not target_list:
|
||||
return None
|
||||
|
||||
# Permission check
|
||||
is_creator = target_list.created_by_id == user_id
|
||||
|
||||
if is_creator:
|
||||
return target_list
|
||||
|
||||
if target_list.group_id:
|
||||
from app.crud.group import is_user_member # Assuming this is a quick check not needing its own transaction
|
||||
from app.crud.group import is_user_member
|
||||
is_member_of_group = await is_user_member(db, group_id=target_list.group_id, user_id=user_id)
|
||||
if is_member_of_group:
|
||||
return target_list
|
||||
|
||||
# If not creator and (not a group list or not a member of the group list)
|
||||
return None
|
||||
|
||||
except OperationalError as e:
|
||||
@ -306,21 +316,16 @@ async def get_lists_statuses_by_ids(db: AsyncSession, list_ids: PyList[int], use
|
||||
return []
|
||||
|
||||
try:
|
||||
# First, get the groups the user is a member of
|
||||
group_ids_result = await db.execute(
|
||||
select(UserGroupModel.group_id).where(UserGroupModel.user_id == user_id)
|
||||
)
|
||||
user_group_ids = group_ids_result.scalars().all()
|
||||
|
||||
# Build the permission logic
|
||||
permission_filter = or_(
|
||||
# User is the creator of the list
|
||||
and_(ListModel.created_by_id == user_id, ListModel.group_id.is_(None)),
|
||||
# List belongs to a group the user is a member of
|
||||
ListModel.group_id.in_(user_group_ids)
|
||||
)
|
||||
|
||||
# Main query to get list data and item counts
|
||||
query = (
|
||||
select(
|
||||
ListModel.id,
|
||||
@ -340,11 +345,7 @@ async def get_lists_statuses_by_ids(db: AsyncSession, list_ids: PyList[int], use
|
||||
|
||||
result = await db.execute(query)
|
||||
|
||||
# The result will be rows of (id, updated_at, item_count).
|
||||
# We need to verify that all requested list_ids that the user *should* have access to are present.
|
||||
# The filter in the query already handles permissions.
|
||||
|
||||
return result.all() # Returns a list of Row objects with id, updated_at, item_count
|
||||
return result.all()
|
||||
|
||||
except OperationalError as e:
|
||||
raise DatabaseConnectionError(f"Failed to connect to database: {str(e)}")

103
be/app/crud/schedule.py
Normal file
@@ -0,0 +1,103 @@
import logging
from datetime import date, timedelta
from typing import List
from itertools import cycle
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.future import select
from app.models import Chore, ChoreAssignment, UserGroup, ChoreTypeEnum, ChoreHistoryEventTypeEnum
from app.crud.group import get_group_by_id
from app.crud.history import create_chore_history_entry
from app.core.exceptions import GroupNotFoundError, ChoreOperationError

logger = logging.getLogger(__name__)


async def generate_group_chore_schedule(
    db: AsyncSession,
    *,
    group_id: int,
    start_date: date,
    end_date: date,
    user_id: int,
    member_ids: List[int] = None
) -> List[ChoreAssignment]:
    """
    Generates a round-robin chore schedule for all group chores within a date range.
    """
    if start_date > end_date:
        raise ChoreOperationError("Start date cannot be after end date.")

    group = await get_group_by_id(db, group_id)
    if not group:
        raise GroupNotFoundError(group_id)

    if not member_ids:
        members_result = await db.execute(
            select(UserGroup.user_id).where(UserGroup.group_id == group_id)
        )
        member_ids = members_result.scalars().all()

    if not member_ids:
        raise ChoreOperationError("Cannot generate schedule with no members.")

    chores_result = await db.execute(
        select(Chore).where(Chore.group_id == group_id, Chore.type == ChoreTypeEnum.group)
    )
    group_chores = chores_result.scalars().all()
    if not group_chores:
        logger.info(f"No chores found in group {group_id} to generate a schedule for.")
        return []

    member_cycle = cycle(member_ids)
    new_assignments = []

    current_date = start_date
    while current_date <= end_date:
        for chore in group_chores:
            if start_date <= chore.next_due_date <= end_date:
                existing_assignment_result = await db.execute(
                    select(ChoreAssignment.id)
                    .where(ChoreAssignment.chore_id == chore.id, ChoreAssignment.due_date == chore.next_due_date)
                    .limit(1)
                )
                if existing_assignment_result.scalar_one_or_none():
                    logger.info(f"Skipping assignment for chore '{chore.name}' on {chore.next_due_date} as it already exists.")
                    continue

                assigned_to_user_id = next(member_cycle)

                assignment = ChoreAssignment(
                    chore_id=chore.id,
                    assigned_to_user_id=assigned_to_user_id,
                    due_date=chore.next_due_date,
                    is_complete=False
                )
                db.add(assignment)
                new_assignments.append(assignment)
                logger.info(f"Created assignment for chore '{chore.name}' to user {assigned_to_user_id} on {chore.next_due_date}")

        current_date += timedelta(days=1)

    if not new_assignments:
        logger.info(f"No new assignments were generated for group {group_id} in the specified date range.")
        return []

    await create_chore_history_entry(
        db,
        chore_id=None,
        group_id=group_id,
        changed_by_user_id=user_id,
        event_type=ChoreHistoryEventTypeEnum.SCHEDULE_GENERATED,
        event_data={
            "start_date": start_date.isoformat(),
            "end_date": end_date.isoformat(),
            "member_ids": member_ids,
            "assignments_created": len(new_assignments)
        }
    )

    await db.flush()

    for assign in new_assignments:
        await db.refresh(assign)

    return new_assignments
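The schedule generator above distributes due chores over the member list with `itertools.cycle`. Its core effect can be sketched in isolation, with hypothetical chore names and no database:

```python
from itertools import cycle

def round_robin(chores, member_ids):
    """Pair each chore with the next member in a repeating cycle,
    as generate_group_chore_schedule does via cycle(member_ids)."""
    members = cycle(member_ids)
    return [(chore, next(members)) for chore in chores]
```

For example, `round_robin(["dishes", "trash", "vacuum"], [1, 2])` assigns members 1, 2, 1 in turn; the cycle simply restarts when the member list is exhausted.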
@@ -1,4 +1,3 @@
# app/crud/settlement.py
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.future import select
from sqlalchemy.orm import selectinload, joinedload
@@ -7,7 +6,7 @@ from sqlalchemy.exc import SQLAlchemyError, OperationalError, IntegrityError
from decimal import Decimal, ROUND_HALF_UP
from typing import List as PyList, Optional, Sequence
from datetime import datetime, timezone
import logging  # Add logging import
import logging

from app.models import (
    Settlement as SettlementModel,
@@ -27,8 +26,9 @@ from app.core.exceptions import (
    SettlementOperationError,
    ConflictError
)
from app.crud.audit import create_financial_audit_log

logger = logging.getLogger(__name__)  # Initialize logger
logger = logging.getLogger(__name__)

async def create_settlement(db: AsyncSession, settlement_in: SettlementCreate, current_user_id: int) -> SettlementModel:
    """Creates a new settlement record."""
@@ -49,13 +49,6 @@ async def create_settlement(db: AsyncSession, settlement_in: SettlementCreate, c
            if not group:
                raise GroupNotFoundError(settlement_in.group_id)

            # Permission check example (can be in API layer too)
            # if current_user_id not in [payer.id, payee.id]:
            #     is_member_stmt = select(UserGroupModel.id).where(UserGroupModel.group_id == group.id, UserGroupModel.user_id == current_user_id).limit(1)
            #     is_member_result = await db.execute(is_member_stmt)
            #     if not is_member_result.scalar_one_or_none():
            #         raise InvalidOperationError("Settlement recorder must be part of the group or one of the parties.")

            db_settlement = SettlementModel(
                group_id=settlement_in.group_id,
                paid_by_user_id=settlement_in.paid_by_user_id,
@@ -68,7 +61,6 @@ async def create_settlement(db: AsyncSession, settlement_in: SettlementCreate, c
            db.add(db_settlement)
            await db.flush()

            # Re-fetch with relationships
            stmt = (
                select(SettlementModel)
                .where(SettlementModel.id == db_settlement.id)
@@ -85,10 +77,15 @@ async def create_settlement(db: AsyncSession, settlement_in: SettlementCreate, c
            if loaded_settlement is None:
                raise SettlementOperationError("Failed to load settlement after creation.")

            await create_financial_audit_log(
                db=db,
                user_id=current_user_id,
                action_type="SETTLEMENT_CREATED",
                entity=loaded_settlement,
            )

            return loaded_settlement
    except (UserNotFoundError, GroupNotFoundError, InvalidOperationError) as e:
        # These are validation errors, re-raise them.
        # If a transaction was started, context manager handles rollback.
        raise
    except IntegrityError as e:
        logger.error(f"Database integrity error during settlement creation: {str(e)}", exc_info=True)
@@ -115,10 +112,8 @@ async def get_settlement_by_id(db: AsyncSession, settlement_id: int) -> Optional
        )
        return result.scalars().first()
    except OperationalError as e:
        # Optional: logger.warning or info if needed for read operations
        raise DatabaseConnectionError(f"DB connection error fetching settlement: {str(e)}")
    except SQLAlchemyError as e:
        # Optional: logger.warning or info if needed for read operations
        raise DatabaseQueryError(f"DB query error fetching settlement: {str(e)}")

async def get_settlements_for_group(db: AsyncSession, group_id: int, skip: int = 0, limit: int = 100) -> Sequence[SettlementModel]:
@@ -173,7 +168,7 @@ async def get_settlements_involving_user(
        raise DatabaseQueryError(f"DB query error fetching user settlements: {str(e)}")


async def update_settlement(db: AsyncSession, settlement_db: SettlementModel, settlement_in: SettlementUpdate) -> SettlementModel:
async def update_settlement(db: AsyncSession, settlement_db: SettlementModel, settlement_in: SettlementUpdate, current_user_id: int) -> SettlementModel:
    """
    Updates an existing settlement.
    Only allows updates to description and settlement_date.
@@ -183,10 +178,6 @@ async def update_settlement(db: AsyncSession, settlement_db: SettlementModel, se
    """
    try:
        async with db.begin_nested() if db.in_transaction() else db.begin():
            # Ensure the settlement_db passed is managed by the current session if not already.
            # This is usually true if fetched by an endpoint dependency using the same session.
            # If not, `db.add(settlement_db)` might be needed before modification if it's detached.

            if not hasattr(settlement_db, 'version') or not hasattr(settlement_in, 'version'):
                raise InvalidOperationError("Version field is missing in model or input for optimistic locking.")

@@ -196,6 +187,11 @@ async def update_settlement(db: AsyncSession, settlement_db: SettlementModel, se
                    f"Your version {settlement_in.version} does not match current version {settlement_db.version}. Please refresh."
                )

            before_state = {c.name: getattr(settlement_db, c.name) for c in settlement_db.__table__.columns if c.name in settlement_in.dict(exclude_unset=True)}
            for k, v in before_state.items():
                if isinstance(v, (datetime, Decimal)):
                    before_state[k] = str(v)

            update_data = settlement_in.model_dump(exclude_unset=True, exclude={"version"})
            allowed_to_update = {"description", "settlement_date"}
            updated_something = False
@@ -204,22 +200,14 @@ async def update_settlement(db: AsyncSession, settlement_db: SettlementModel, se
                if field in allowed_to_update:
                    setattr(settlement_db, field, value)
                    updated_something = True
                # Silently ignore fields not allowed to update or raise error:
                # else:
                #     raise InvalidOperationError(f"Field '{field}' cannot be updated.")

            if not updated_something and not settlement_in.model_fields_set.intersection(allowed_to_update):
                # No updatable fields were actually provided, or they didn't change
                # Still, we might want to return the re-loaded settlement if version matched.
                pass

            settlement_db.version += 1
            settlement_db.updated_at = datetime.now(timezone.utc)  # Ensure model has this field

            db.add(settlement_db)  # Mark as dirty
            settlement_db.updated_at = datetime.now(timezone.utc)
            await db.flush()

            # Re-fetch with relationships
            stmt = (
                select(SettlementModel)
                .where(SettlementModel.id == settlement_db.id)
@@ -233,11 +221,24 @@ async def update_settlement(db: AsyncSession, settlement_db: SettlementModel, se
            result = await db.execute(stmt)
            updated_settlement = result.scalar_one_or_none()

            if updated_settlement is None:  # Should not happen
            if updated_settlement is None:
                raise SettlementOperationError("Failed to load settlement after update.")

            after_state = {c.name: getattr(updated_settlement, c.name) for c in updated_settlement.__table__.columns if c.name in update_data}
            for k, v in after_state.items():
                if isinstance(v, (datetime, Decimal)):
                    after_state[k] = str(v)

            await create_financial_audit_log(
                db=db,
                user_id=current_user_id,
                action_type="SETTLEMENT_UPDATED",
                entity=updated_settlement,
                details={"before": before_state, "after": after_state}
            )

            return updated_settlement
    except ConflictError as e:  # ConflictError should be defined in exceptions
    except ConflictError as e:
        raise
    except InvalidOperationError as e:
        raise
@@ -252,7 +253,7 @@ async def update_settlement(db: AsyncSession, settlement_db: SettlementModel, se
        raise DatabaseTransactionError(f"DB transaction error during settlement update: {str(e)}")


async def delete_settlement(db: AsyncSession, settlement_db: SettlementModel, expected_version: Optional[int] = None) -> None:
async def delete_settlement(db: AsyncSession, settlement_db: SettlementModel, current_user_id: int, expected_version: Optional[int] = None) -> None:
    """
    Deletes a settlement. Requires version matching if expected_version is provided.
    Assumes SettlementModel has a version field.
@@ -261,13 +262,26 @@ async def delete_settlement(db: AsyncSession, settlement_db: SettlementModel, ex
        async with db.begin_nested() if db.in_transaction() else db.begin():
            if expected_version is not None:
                if not hasattr(settlement_db, 'version') or settlement_db.version != expected_version:
                    raise ConflictError(  # Make sure ConflictError is defined
                    raise ConflictError(
                        f"Settlement (ID: {settlement_db.id}) cannot be deleted. "
                        f"Expected version {expected_version} does not match current version {settlement_db.version}. Please refresh."
                    )

            details = {c.name: getattr(settlement_db, c.name) for c in settlement_db.__table__.columns}
            for k, v in details.items():
                if isinstance(v, (datetime, Decimal)):
                    details[k] = str(v)

            await create_financial_audit_log(
                db=db,
                user_id=current_user_id,
                action_type="SETTLEMENT_DELETED",
                entity=settlement_db,
                details=details
            )

            await db.delete(settlement_db)
    except ConflictError as e:  # ConflictError should be defined
    except ConflictError as e:
        raise
    except OperationalError as e:
        logger.error(f"Database connection error during settlement deletion: {str(e)}", exc_info=True)
@@ -275,7 +289,3 @@ async def delete_settlement(db: AsyncSession, settlement_db: SettlementModel, ex
    except SQLAlchemyError as e:
        logger.error(f"Unexpected SQLAlchemy error during settlement deletion: {str(e)}", exc_info=True)
        raise DatabaseTransactionError(f"DB transaction error during settlement deletion: {str(e)}")

# Ensure SettlementOperationError and ConflictError are defined in app.core.exceptions
# Example: class SettlementOperationError(AppException): pass
# Example: class ConflictError(AppException): status_code = 409
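Both `update_settlement` and `delete_settlement` guard against lost updates by comparing a client-supplied version against the row's current version and bumping it on success. A database-free sketch of that optimistic-locking check follows; the class and function names here are illustrative, and the record is modeled as a plain dict rather than an ORM row:

```python
class ConflictError(Exception):
    """Raised when the client's version no longer matches the stored row."""

def apply_versioned_update(record, incoming_version, updates):
    # Reject the write unless the caller saw the latest version.
    if incoming_version != record["version"]:
        raise ConflictError(
            f"Your version {incoming_version} does not match "
            f"current version {record['version']}. Please refresh."
        )
    record.update(updates)
    # Bump so any concurrent writer still holding the old version fails.
    record["version"] += 1
    return record
```

A second writer that read the row before the first write committed will present a stale version and get a conflict instead of silently overwriting the first change.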
@@ -14,9 +14,10 @@ from app.models import (
    ExpenseSplitStatusEnum,
    ExpenseOverallStatusEnum,
)
# Placeholder for Pydantic schema - actual schema definition is a later step
# from app.schemas.settlement_activity import SettlementActivityCreate # Assuming this path
from pydantic import BaseModel  # Using pydantic BaseModel directly for the placeholder
from pydantic import BaseModel
from app.crud.audit import create_financial_audit_log
from app.schemas.settlement_activity import SettlementActivityCreate
from app.core.exceptions import UserNotFoundError, InvalidOperationError, FinancialConflictError, OverpaymentError


class SettlementActivityCreatePlaceholder(BaseModel):
@@ -26,8 +27,7 @@ class SettlementActivityCreatePlaceholder(BaseModel):
    paid_at: Optional[datetime] = None

    class Config:
        orm_mode = True  # Pydantic V1 style orm_mode
        # from_attributes = True # Pydantic V2 style
        orm_mode = True


async def update_expense_split_status(db: AsyncSession, expense_split_id: int) -> Optional[ExpenseSplit]:
@@ -35,7 +35,6 @@ async def update_expense_split_status(db: AsyncSession, expense_split_id: int) -
    Updates the status of an ExpenseSplit based on its settlement activities.
    Also updates the overall status of the parent Expense.
    """
    # Fetch the ExpenseSplit with its related settlement_activities and the parent expense
    result = await db.execute(
        select(ExpenseSplit)
        .options(
@@ -47,18 +46,13 @@ async def update_expense_split_status(db: AsyncSession, expense_split_id: int) -
    expense_split = result.scalar_one_or_none()

    if not expense_split:
        # Or raise an exception, depending on desired error handling
        return None

    # Calculate total_paid from all settlement_activities for that split
    total_paid = sum(activity.amount_paid for activity in expense_split.settlement_activities)
    total_paid = Decimal(total_paid).quantize(Decimal("0.01"))  # Ensure two decimal places
    total_paid = Decimal(total_paid).quantize(Decimal("0.01"))

    # Compare total_paid with ExpenseSplit.owed_amount
    if total_paid >= expense_split.owed_amount:
        expense_split.status = ExpenseSplitStatusEnum.paid
        # Set paid_at to the latest relevant SettlementActivity or current time
        # For simplicity, let's find the latest paid_at from activities, or use now()
        latest_paid_at = None
        if expense_split.settlement_activities:
            latest_paid_at = max(act.paid_at for act in expense_split.settlement_activities if act.paid_at)
@@ -66,13 +60,13 @@ async def update_expense_split_status(db: AsyncSession, expense_split_id: int) -
        expense_split.paid_at = latest_paid_at if latest_paid_at else datetime.now(timezone.utc)
    elif total_paid > 0:
        expense_split.status = ExpenseSplitStatusEnum.partially_paid
        expense_split.paid_at = None  # Clear paid_at if not fully paid
        expense_split.paid_at = None
    else:  # total_paid == 0
        expense_split.status = ExpenseSplitStatusEnum.unpaid
        expense_split.paid_at = None  # Clear paid_at
        expense_split.paid_at = None

    await db.flush()
    await db.refresh(expense_split, attribute_names=['status', 'paid_at', 'expense'])  # Refresh to get updated data and related expense
    await db.refresh(expense_split, attribute_names=['status', 'paid_at', 'expense'])

    return expense_split

@@ -81,18 +75,16 @@ async def update_expense_overall_status(db: AsyncSession, expense_id: int) -> Op
    """
    Updates the overall_status of an Expense based on the status of its splits.
    """
    # Fetch the Expense with its related splits
    result = await db.execute(
        select(Expense).options(selectinload(Expense.splits)).where(Expense.id == expense_id)
    )
    expense = result.scalar_one_or_none()

    if not expense:
        # Or raise an exception
        return None

    if not expense.splits:  # No splits, should not happen for a valid expense but handle defensively
        expense.overall_settlement_status = ExpenseOverallStatusEnum.unpaid  # Or some other default/error state
    if not expense.splits:
        expense.overall_settlement_status = ExpenseOverallStatusEnum.unpaid
        await db.flush()
        await db.refresh(expense)
        return expense
@@ -107,14 +99,14 @@ async def update_expense_overall_status(db: AsyncSession, expense_id: int) -> Op
            num_paid_splits += 1
        elif split.status == ExpenseSplitStatusEnum.partially_paid:
            num_partially_paid_splits += 1
        else:  # unpaid
        else:
            num_unpaid_splits += 1

    if num_paid_splits == num_splits:
        expense.overall_settlement_status = ExpenseOverallStatusEnum.paid
    elif num_unpaid_splits == num_splits:
        expense.overall_settlement_status = ExpenseOverallStatusEnum.unpaid
    else:  # Mix of paid, partially_paid, or unpaid but not all unpaid/paid
    else:
        expense.overall_settlement_status = ExpenseOverallStatusEnum.partially_paid

    await db.flush()
@@ -124,51 +116,96 @@ async def update_expense_overall_status(db: AsyncSession, expense_id: int) -> Op

async def create_settlement_activity(
    db: AsyncSession,
    settlement_activity_in: SettlementActivityCreatePlaceholder,
    settlement_activity_in: SettlementActivityCreate,
    current_user_id: int
) -> Optional[SettlementActivity]:
) -> SettlementActivity:
    """
    Creates a new settlement activity, then updates the parent expense split and expense statuses.
    Uses pessimistic locking on the ExpenseSplit row to prevent race conditions.
    Relies on the calling context (e.g., transactional session dependency) for the transaction.
    """
    # Validate ExpenseSplit
    split_result = await db.execute(select(ExpenseSplit).where(ExpenseSplit.id == settlement_activity_in.expense_split_id))
    # Lock the expense split row for the duration of the transaction
    split_stmt = (
        select(ExpenseSplit)
        .where(ExpenseSplit.id == settlement_activity_in.expense_split_id)
        .with_for_update()
    )
    split_result = await db.execute(split_stmt)
    expense_split = split_result.scalar_one_or_none()

    if not expense_split:
        # Consider raising an HTTPException in an API layer
        return None  # ExpenseSplit not found
        raise InvalidOperationError(f"Expense split with ID {settlement_activity_in.expense_split_id} not found.")

    # Validate User (paid_by_user_id)
    # Check if the split is already fully paid
    if expense_split.status == ExpenseSplitStatusEnum.paid:
        raise FinancialConflictError(f"Expense split {expense_split.id} is already fully paid.")

    # Calculate current total paid to prevent overpayment
    current_total_paid = Decimal("0.00")
    if expense_split.settlement_activities:
        current_total_paid = sum(
            Decimal(str(activity.amount_paid)) for activity in expense_split.settlement_activities
        )
        current_total_paid = current_total_paid.quantize(Decimal("0.01"))

    new_payment_amount = Decimal(str(settlement_activity_in.amount_paid)).quantize(Decimal("0.01"))
    projected_total = current_total_paid + new_payment_amount
    owed_amount = Decimal(str(expense_split.owed_amount)).quantize(Decimal("0.01"))

    # Prevent overpayment (with small epsilon for floating point precision)
    epsilon = Decimal("0.01")
    if projected_total > (owed_amount + epsilon):
        remaining_amount = owed_amount - current_total_paid
        raise OverpaymentError(
            f"Payment amount {new_payment_amount} would exceed remaining owed amount. "
            f"Maximum payment allowed: {remaining_amount} (owed: {owed_amount}, already paid: {current_total_paid})"
        )

    # Validate that the user paying exists
    user_result = await db.execute(select(User).where(User.id == settlement_activity_in.paid_by_user_id))
    paid_by_user = user_result.scalar_one_or_none()
    if not paid_by_user:
        return None  # User not found
    if not user_result.scalar_one_or_none():
        raise UserNotFoundError(user_id=settlement_activity_in.paid_by_user_id)

    # Create SettlementActivity instance
    db_settlement_activity = SettlementActivity(
        expense_split_id=settlement_activity_in.expense_split_id,
        paid_by_user_id=settlement_activity_in.paid_by_user_id,
        amount_paid=settlement_activity_in.amount_paid,
        paid_at=settlement_activity_in.paid_at if settlement_activity_in.paid_at else datetime.now(timezone.utc),
        created_by_user_id=current_user_id  # The user recording the activity
        created_by_user_id=current_user_id
    )

    db.add(db_settlement_activity)
    await db.flush()  # Flush to get the ID for db_settlement_activity
    await db.flush()

    # Update statuses
    await create_financial_audit_log(
        db=db,
        user_id=current_user_id,
        action_type="SETTLEMENT_ACTIVITY_CREATED",
        entity=db_settlement_activity,
    )

    # Update statuses within the same transaction
    updated_split = await update_expense_split_status(db, expense_split_id=db_settlement_activity.expense_split_id)
    if updated_split and updated_split.expense_id:
        await update_expense_overall_status(db, expense_id=updated_split.expense_id)
    else:
        # This case implies update_expense_split_status returned None or expense_id was missing.
        # This could be a problem, consider logging or raising an error.
        # For now, the transaction would roll back if an exception is raised.
        # If not raising, the overall status update might be skipped.
        pass  # Or handle error

    await db.refresh(db_settlement_activity, attribute_names=['split', 'payer', 'creator'])  # Refresh to load relationships
    # Re-fetch the object with all relationships loaded to prevent lazy-loading issues during serialization
    stmt = (
        select(SettlementActivity)
        .where(SettlementActivity.id == db_settlement_activity.id)
        .options(
            selectinload(SettlementActivity.payer),
            selectinload(SettlementActivity.creator)
        )
    )
    result = await db.execute(stmt)
    loaded_activity = result.scalar_one_or_none()

    return db_settlement_activity
    if not loaded_activity:
        # This should not happen in a normal flow
        raise InvalidOperationError("Failed to load settlement activity after creation.")

    return loaded_activity


async def get_settlement_activity_by_id(
@@ -180,9 +217,9 @@ async def get_settlement_activity_by_id(
    result = await db.execute(
        select(SettlementActivity)
        .options(
            selectinload(SettlementActivity.split).selectinload(ExpenseSplit.expense),  # Load split and its parent expense
            selectinload(SettlementActivity.payer),  # Load the user who paid
            selectinload(SettlementActivity.creator)  # Load the user who created the record
            selectinload(SettlementActivity.split).selectinload(ExpenseSplit.expense),
            selectinload(SettlementActivity.payer),
            selectinload(SettlementActivity.creator)
        )
        .where(SettlementActivity.id == settlement_activity_id)
    )
@@ -199,8 +236,8 @@ async def get_settlement_activities_for_split(
        select(SettlementActivity)
        .where(SettlementActivity.expense_split_id == expense_split_id)
        .options(
            selectinload(SettlementActivity.payer),  # Load the user who paid
            selectinload(SettlementActivity.creator)  # Load the user who created the record
            selectinload(SettlementActivity.payer),
            selectinload(SettlementActivity.creator)
        )
        .order_by(SettlementActivity.paid_at.desc(), SettlementActivity.created_at.desc())
        .offset(skip)
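The overpayment guard added in `create_settlement_activity` quantizes all amounts to cents and rejects a payment once the running total would exceed the owed amount plus a one-cent epsilon. That arithmetic can be isolated as a small sketch; the function name is illustrative and the amounts are passed as strings for exact `Decimal` construction:

```python
from decimal import Decimal

class OverpaymentError(Exception):
    """Raised when a payment would push the total past the owed amount."""

def check_payment(owed, already_paid, new_payment):
    # Quantize everything to cents, as the CRUD layer does.
    owed_amount = Decimal(owed).quantize(Decimal("0.01"))
    current_total = Decimal(already_paid).quantize(Decimal("0.01"))
    projected = current_total + Decimal(new_payment).quantize(Decimal("0.01"))
    epsilon = Decimal("0.01")  # Small allowance for rounding.
    if projected > owed_amount + epsilon:
        raise OverpaymentError(
            f"Maximum payment allowed: {owed_amount - current_total}"
        )
    return projected
```

Using strings rather than floats avoids binary-float artifacts such as `Decimal(0.1)` carrying dozens of spurious digits, which is also why the CRUD code wraps amounts in `Decimal(str(...))`.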
@@ -1,12 +1,11 @@
# app/crud/user.py
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.future import select
from sqlalchemy.orm import selectinload  # Ensure selectinload is imported
from sqlalchemy.orm import selectinload
from sqlalchemy.exc import SQLAlchemyError, IntegrityError, OperationalError
from typing import Optional
import logging  # Add logging import
import logging

from app.models import User as UserModel, UserGroup as UserGroupModel, Group as GroupModel  # Import related models for selectinload
from app.models import User as UserModel, UserGroup as UserGroupModel
from app.schemas.user import UserCreate
from app.core.security import hash_password
from app.core.exceptions import (
@@ -16,23 +15,19 @@ from app.core.exceptions import (
    DatabaseIntegrityError,
    DatabaseQueryError,
    DatabaseTransactionError,
    UserOperationError  # Add if specific user operation errors are needed
    UserOperationError
)

logger = logging.getLogger(__name__)  # Initialize logger
logger = logging.getLogger(__name__)

async def get_user_by_email(db: AsyncSession, email: str) -> Optional[UserModel]:
    """Fetches a user from the database by email, with common relationships."""
    try:
        # db.begin() is not strictly necessary for a single read, but ensures atomicity if multiple reads were added.
        # For a single select, it can be omitted if preferred, session handles connection.
        stmt = (
            select(UserModel)
            .filter(UserModel.email == email)
            .options(
                selectinload(UserModel.group_associations).selectinload(UserGroupModel.group),  # Groups user is member of
                selectinload(UserModel.created_groups)  # Groups user created
                # Add other relationships as needed by UserPublic schema
                selectinload(UserModel.group_associations).selectinload(UserGroupModel.group),
            )
        )
        result = await db.execute(stmt)
@@ -44,34 +39,33 @@ async def get_user_by_email(db: AsyncSession, email: str) -> Optional[UserModel]
        logger.error(f"Unexpected SQLAlchemy error while fetching user by email '{email}': {str(e)}", exc_info=True)
        raise DatabaseQueryError(f"Failed to query user: {str(e)}")

async def create_user(db: AsyncSession, user_in: UserCreate) -> UserModel:
async def create_user(db: AsyncSession, user_in: UserCreate, is_guest: bool = False) -> UserModel:
    """Creates a new user record in the database with common relationships loaded."""
    try:
        async with db.begin_nested() if db.in_transaction() else db.begin() as transaction:
            _hashed_password = hash_password(user_in.password)
            db_user = UserModel(
                email=user_in.email,
                hashed_password=_hashed_password,  # Field name in model is hashed_password
                name=user_in.name
                hashed_password=_hashed_password,
                name=user_in.name,
                is_guest=is_guest
            )
            db.add(db_user)
            await db.flush()  # Flush to get DB-generated values like ID
            await db.flush()

            # Re-fetch with relationships
            stmt = (
                select(UserModel)
                .where(UserModel.id == db_user.id)
                .options(
                    selectinload(UserModel.group_associations).selectinload(UserGroupModel.group),
                    selectinload(UserModel.created_groups)
                    # Add other relationships as needed by UserPublic schema
                )
            )
            result = await db.execute(stmt)
            loaded_user = result.scalar_one_or_none()

            if loaded_user is None:
                raise UserOperationError("Failed to load user after creation.")  # Define UserOperationError
                raise UserOperationError("Failed to load user after creation.")

            return loaded_user
    except IntegrityError as e:
@@ -85,6 +79,3 @@ async def create_user(db: AsyncSession, user_in: UserCreate) -> UserModel:
    except SQLAlchemyError as e:
        logger.error(f"Unexpected SQLAlchemy error during user creation for email '{user_in.email}': {str(e)}", exc_info=True)
        raise DatabaseTransactionError(f"Failed to create user due to other DB error: {str(e)}")

# Ensure UserOperationError is defined in app.core.exceptions if used
# Example: class UserOperationError(AppException): pass
|
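The create-then-reload pattern in `create_user` above (flush to obtain the DB-generated id, then re-select the full row so the caller sees server-generated defaults) can be sketched with stdlib `sqlite3` standing in for the async SQLAlchemy session. Table and column names here are illustrative only, not the project's schema.

```python
import sqlite3

# Minimal sketch: insert, grab the DB-generated primary key, then re-read
# the full row. sqlite3 stands in for the async SQLAlchemy session.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT, is_guest INTEGER DEFAULT 0)")

cur = conn.execute("INSERT INTO users (email) VALUES (?)", ("a@example.com",))
new_id = cur.lastrowid  # analogous to await db.flush() exposing db_user.id

row = conn.execute("SELECT id, email, is_guest FROM users WHERE id = ?", (new_id,)).fetchone()
if row is None:
    raise RuntimeError("Failed to load user after creation.")  # mirrors UserOperationError
print(row)  # (1, 'a@example.com', 0)
```

In the real code the re-select also eager-loads relationships via `selectinload`, which a raw SQL sketch cannot show; the point is only the flush-then-refetch sequencing.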
@@ -1,24 +1,18 @@
# app/database.py
from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession
from sqlalchemy.orm import sessionmaker, declarative_base
from app.config import settings

# Ensure DATABASE_URL is set before proceeding
if not settings.DATABASE_URL:
    raise ValueError("DATABASE_URL is not configured in settings.")

# Create the SQLAlchemy async engine
# pool_recycle=3600 helps prevent stale connections on some DBs
engine = create_async_engine(
    settings.DATABASE_URL,
    echo=False, # Disable SQL query logging for production (use DEBUG log level to enable)
    future=True, # Use SQLAlchemy 2.0 style features
    pool_recycle=3600, # Optional: recycle connections after 1 hour
    pool_pre_ping=True # Add this line to ensure connections are live
    echo=False,
    future=True,
    pool_recycle=3600,
    pool_pre_ping=True
)

# Create a configured "Session" class
# expire_on_commit=False prevents attributes from expiring after commit
AsyncSessionLocal = sessionmaker(
    bind=engine,
    class_=AsyncSession,
@@ -27,10 +21,8 @@ AsyncSessionLocal = sessionmaker(
    autocommit=False,
)

# Base class for our ORM models
Base = declarative_base()

# Dependency to get DB session in path operations
async def get_session() -> AsyncSession: # type: ignore
    """
    Dependency function that yields an AsyncSession for read-only operations.
@@ -38,7 +30,6 @@ async def get_session() -> AsyncSession: # type: ignore
    """
    async with AsyncSessionLocal() as session:
        yield session
    # The 'async with' block handles session.close() automatically.

async def get_transactional_session() -> AsyncSession: # type: ignore
    """
@@ -51,7 +42,5 @@ async def get_transactional_session() -> AsyncSession: # type: ignore
    async with AsyncSessionLocal() as session:
        async with session.begin():
            yield session
    # Transaction is automatically committed on success or rolled back on exception

# Alias for backward compatibility
get_db = get_session
@@ -1,4 +1,2 @@
from app.database import AsyncSessionLocal

# Export the async session factory
async_session = AsyncSessionLocal
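The contract `get_transactional_session` builds on `session.begin()` is: commit if the dependency's block succeeds, roll back if it raises. A synchronous sketch of that contract with `sqlite3`; `transactional_session` is a hypothetical stand-in, not the project's API.

```python
import sqlite3
from contextlib import contextmanager

# Hedged sketch of commit-on-success / rollback-on-error, which is what
# session.begin() provides in the async dependency above.
@contextmanager
def transactional_session(conn):
    try:
        yield conn
        conn.commit()
    except Exception:
        conn.rollback()
        raise

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (v TEXT)")
conn.commit()

with transactional_session(conn) as s:
    s.execute("INSERT INTO t VALUES ('ok')")  # committed

try:
    with transactional_session(conn) as s:
        s.execute("INSERT INTO t VALUES ('bad')")
        raise RuntimeError("boom")  # triggers rollback; 'bad' never lands
except RuntimeError:
    pass

print([r[0] for r in conn.execute("SELECT v FROM t")])  # ['ok']
```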
@@ -1,11 +1,13 @@
from datetime import datetime, timedelta
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select, and_
from app.models import Expense, RecurrencePattern
from sqlalchemy.orm import selectinload
from app.models import Expense, RecurrencePattern, SplitTypeEnum
from app.crud.expense import create_expense
from app.schemas.expense import ExpenseCreate
from app.schemas.expense import ExpenseCreate, ExpenseSplitCreate
import logging
from typing import Optional
import enum

logger = logging.getLogger(__name__)

@@ -15,23 +17,22 @@ async def generate_recurring_expenses(db: AsyncSession) -> None:
    Should be run daily to check for and create new recurring expenses.
    """
    try:
        # Get all active recurring expenses that need to be generated
        now = datetime.utcnow()
        query = select(Expense).join(RecurrencePattern).where(
            and_(
                Expense.is_recurring == True,
                Expense.next_occurrence <= now,
                # Check if we haven't reached max occurrences
                (
                    (RecurrencePattern.max_occurrences == None) |
                    (RecurrencePattern.max_occurrences > 0)
                ),
                # Check if we haven't reached end date
                (
                    (RecurrencePattern.end_date == None) |
                    (RecurrencePattern.end_date > now)
                )
            )
        ).options(
            selectinload(Expense.splits) # Eager load splits to use as a template
        )

        result = await db.execute(query)
@@ -40,26 +41,47 @@ async def generate_recurring_expenses(db: AsyncSession) -> None:
        for expense in recurring_expenses:
            try:
                await _generate_next_occurrence(db, expense)
                # Persist changes for this expense before moving to the next one
                await db.commit()
            except Exception as e:
                logger.error(f"Error generating next occurrence for expense {expense.id}: {str(e)}")
                logger.error(f"Error generating next occurrence for expense {expense.id}: {str(e)}", exc_info=True)
                await db.rollback()
                continue

    except Exception as e:
        logger.error(f"Error in generate_recurring_expenses job: {str(e)}")
        raise
        logger.error(f"Error in generate_recurring_expenses job during expense fetch: {str(e)}", exc_info=True)
        # Do not re-raise, allow the job scheduler to run again later

async def _generate_next_occurrence(db: AsyncSession, expense: Expense) -> None:
    """Generate the next occurrence of a recurring expense."""
    pattern = expense.recurrence_pattern
    if not pattern:
        logger.warning(f"Recurring expense {expense.id} is missing its recurrence pattern.")
        return

    # Calculate next occurrence date
    next_date = _calculate_next_occurrence(expense.next_occurrence, pattern)
    if not next_date:
        logger.info(f"No next occurrence date for expense {expense.id}, stopping recurrence.")
        expense.is_recurring = False # Stop future processing
        await db.flush()
        return

    # Create new expense based on template
    # Recreate splits from the template expense if needed
    splits_data = None
    if expense.split_type not in [SplitTypeEnum.EQUAL, SplitTypeEnum.ITEM_BASED]:
        if not expense.splits:
            logger.error(f"Cannot generate next occurrence for expense {expense.id} with split type {expense.split_type.value} because it has no splits to use as a template.")
            return

        splits_data = [
            ExpenseSplitCreate(
                user_id=split.user_id,
                owed_amount=split.owed_amount,
                share_percentage=split.share_percentage,
                share_units=split.share_units,
            ) for split in expense.splits
        ]

    new_expense = ExpenseCreate(
        description=expense.description,
        total_amount=expense.total_amount,
@@ -70,50 +92,98 @@ async def _generate_next_occurrence(db: AsyncSession, expense: Expense) -> None:
        group_id=expense.group_id,
        item_id=expense.item_id,
        paid_by_user_id=expense.paid_by_user_id,
        is_recurring=False, # Generated expenses are not recurring
        splits_in=None # Will be generated based on split_type
        is_recurring=False, # The new expense is a single occurrence, not a recurring template
        splits_in=splits_data
    )

    # Create the new expense
    # We pass the original creator's ID
    created_expense = await create_expense(db, new_expense, expense.created_by_user_id)
    logger.info(f"Generated new expense {created_expense.id} from recurring expense {expense.id}.")

    # Update the original expense
    # Update the template expense for the next run
    expense.last_occurrence = next_date
    expense.next_occurrence = _calculate_next_occurrence(next_date, pattern)
    next_next_date = _calculate_next_occurrence(next_date, pattern)

    if pattern.max_occurrences:
        # Decrement occurrence count if it exists
    if pattern.max_occurrences is not None:
        pattern.max_occurrences -= 1
        if pattern.max_occurrences <= 0:
            next_next_date = None # Stop recurrence

    expense.next_occurrence = next_next_date
    if not expense.next_occurrence:
        expense.is_recurring = False # End the recurrence

    await db.flush()

def _calculate_next_occurrence(current_date: datetime, pattern: RecurrencePattern) -> Optional[datetime]:
    """Calculate the next occurrence date based on the pattern."""
    """Calculate the next occurrence date based on the recurrence pattern provided."""

    if not current_date:
        return None

    if pattern.type == 'daily':
        return current_date + timedelta(days=pattern.interval)
    # Extract a lowercase string of the recurrence type regardless of whether it is an Enum member or a str.
    if isinstance(pattern.type, enum.Enum):
        pattern_type = pattern.type.value.lower()
    else:
        pattern_type = str(pattern.type).lower()

    elif pattern.type == 'weekly':
    next_date: Optional[datetime] = None

    if pattern_type == 'daily':
        next_date = current_date + timedelta(days=pattern.interval)

    elif pattern_type == 'weekly':
        if not pattern.days_of_week:
            return current_date + timedelta(weeks=pattern.interval)

        # Find next day of week
            next_date = current_date + timedelta(weeks=pattern.interval)
        else:
            current_weekday = current_date.weekday()
            next_weekday = min((d for d in pattern.days_of_week if d > current_weekday),
                               default=min(pattern.days_of_week))
            days_ahead = next_weekday - current_weekday
            if days_ahead <= 0:
                days_ahead += 7
            return current_date + timedelta(days=days_ahead)
            # ``days_of_week`` can be stored either as a list[int] (Python-side) or as a
            # comma-separated string in the database. We normalise it to a list[int].
            days_of_week_iterable = []
            if pattern.days_of_week is None:
                days_of_week_iterable = []
            elif isinstance(pattern.days_of_week, (list, tuple)):
                days_of_week_iterable = list(pattern.days_of_week)
            else:
                # Assume comma-separated string like "1,3,5"
                try:
                    days_of_week_iterable = [int(d.strip()) for d in str(pattern.days_of_week).split(',') if d.strip().isdigit()]
                except Exception:
                    days_of_week_iterable = []

    elif pattern.type == 'monthly':
        # Add months to current date
            # Find the next valid weekday after the current one
            next_days = sorted([d for d in days_of_week_iterable if d > current_weekday])
            if next_days:
                days_ahead = next_days[0] - current_weekday
                next_date = current_date + timedelta(days=days_ahead)
            else:
                # Jump to the first valid day in a future week respecting the interval
                if days_of_week_iterable:
                    days_ahead = (7 - current_weekday) + min(days_of_week_iterable)
                    next_date = current_date + timedelta(days=days_ahead)
                    if pattern.interval > 1:
                        next_date += timedelta(weeks=pattern.interval - 1)

    elif pattern_type == 'monthly':
        # Move `interval` months forward while keeping the day component stable where possible.
        year = current_date.year + (current_date.month + pattern.interval - 1) // 12
        month = (current_date.month + pattern.interval - 1) % 12 + 1
        return current_date.replace(year=year, month=month)
        try:
            next_date = current_date.replace(year=year, month=month)
        except ValueError:
            # Handle cases like Feb-31st by rolling back to the last valid day of the new month.
            next_date = (current_date.replace(day=1, year=year, month=month) + timedelta(days=31)).replace(day=1) - timedelta(days=1)

    elif pattern.type == 'yearly':
        return current_date.replace(year=current_date.year + pattern.interval)
    elif pattern_type == 'yearly':
        try:
            next_date = current_date.replace(year=current_date.year + pattern.interval)
        except ValueError:
            # Leap-year edge-case; fallback to Feb-28 if Feb-29 does not exist in the target year.
            next_date = current_date.replace(year=current_date.year + pattern.interval, day=28)

    # Stop recurrence if beyond end_date
    if pattern.end_date and next_date and next_date > pattern.end_date:
        return None

    return next_date
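The monthly branch of `_calculate_next_occurrence` combines integer month arithmetic with a `ValueError` fallback that clamps to the last valid day of the target month (e.g. Jan 31 + 1 month). That arithmetic can be exercised in isolation:

```python
from datetime import datetime, timedelta

# Standalone sketch of the month-advance logic from the diff above:
# advance `interval` months, clamping when the source day does not
# exist in the target month.
def add_months(current: datetime, interval: int) -> datetime:
    year = current.year + (current.month + interval - 1) // 12
    month = (current.month + interval - 1) % 12 + 1
    try:
        return current.replace(year=year, month=month)
    except ValueError:
        # Roll back to the last day of the target month: first of the
        # target month, jump past its end, snap to the next month's
        # first day, then step back one day.
        first = current.replace(day=1, year=year, month=month)
        return (first + timedelta(days=31)).replace(day=1) - timedelta(days=1)

print(add_months(datetime(2024, 1, 31), 1))  # 2024-02-29 00:00:00 (leap year)
print(add_months(datetime(2023, 1, 31), 1))  # 2023-02-28 00:00:00
```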
203 be/app/main.py
@@ -1,60 +1,44 @@
# app/main.py
import logging
import uvicorn
from fastapi import FastAPI, HTTPException, Depends, status, Request
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from starlette.middleware.sessions import SessionMiddleware
import sentry_sdk
from sentry_sdk.integrations.fastapi import FastApiIntegration
from fastapi_users.authentication import JWTStrategy
from pydantic import BaseModel
from jose import jwt, JWTError
from sqlalchemy.ext.asyncio import AsyncEngine
from alembic.config import Config
from alembic import command
import os
import sys

from app.api.api_router import api_router
from app.config import settings
from app.core.api_config import API_METADATA, API_TAGS
from app.auth import fastapi_users, auth_backend, get_refresh_jwt_strategy, get_jwt_strategy
from app.models import User
from app.api.auth.oauth import router as oauth_router
from app.auth import fastapi_users, auth_backend
from app.schemas.user import UserPublic, UserCreate, UserUpdate
from app.core.scheduler import init_scheduler, shutdown_scheduler
from app.database import get_session
from sqlalchemy import select
from app.core.middleware import RequestContextMiddleware
from app.core.logging_utils import PiiRedactionFilter
from app.core.error_handlers import sqlalchemy_exception_handler, generic_exception_handler
from app.core.rate_limiter import RateLimitMiddleware

# Response model for refresh endpoint
class RefreshResponse(BaseModel):
    access_token: str
    refresh_token: str
    token_type: str = "bearer"

# Initialize Sentry only if DSN is provided
if settings.SENTRY_DSN:
    sentry_sdk.init(
        dsn=settings.SENTRY_DSN,
        integrations=[
            FastApiIntegration(),
        ],
        # Adjust traces_sample_rate for production
        traces_sample_rate=0.1 if settings.is_production else 1.0,
        environment=settings.ENVIRONMENT,
        # Enable PII data only in development
        send_default_pii=not settings.is_production
    )

# --- Logging Setup ---
logging.basicConfig(
    level=getattr(logging, settings.LOG_LEVEL),
    format=settings.LOG_FORMAT
)

# Attach PII redaction filter to root logger
root_logger = logging.getLogger()
root_logger.addFilter(PiiRedactionFilter())

logger = logging.getLogger(__name__)

# --- FastAPI App Instance ---
# Create API metadata with environment-dependent settings
api_metadata = {
    **API_METADATA,
    "docs_url": settings.docs_url,
@@ -67,167 +51,53 @@ app = FastAPI(
    openapi_tags=API_TAGS
)

# Add session middleware for OAuth
app.add_middleware(
    SessionMiddleware,
    secret_key=settings.SESSION_SECRET_KEY
)

# --- CORS Middleware ---
# Structured logging & request tracing
app.add_middleware(RequestContextMiddleware)
app.add_middleware(RateLimitMiddleware)

app.add_middleware(
    CORSMiddleware,
    allow_origins=settings.cors_origins_list,
    allow_credentials=True,
    allow_origins=(settings.cors_origins_list if not settings.is_production else [settings.FRONTEND_URL]),
    # Credentials (cookies) are not required because we use JWTs in Authorization headers.
    allow_credentials=False,
    allow_methods=["*"],
    allow_headers=["*"],
    expose_headers=["*"]
)
# --- End CORS Middleware ---

# Refresh token endpoint
@app.post("/auth/jwt/refresh", response_model=RefreshResponse, tags=["auth"])
async def refresh_jwt_token(
    request: Request,
    refresh_strategy: JWTStrategy = Depends(get_refresh_jwt_strategy),
    access_strategy: JWTStrategy = Depends(get_jwt_strategy),
):
    """
    Refresh access token using a valid refresh token.
    Send refresh token in Authorization header: Bearer <refresh_token>
    """
    try:
        # Get refresh token from Authorization header
        authorization = request.headers.get("Authorization")
        if not authorization or not authorization.startswith("Bearer "):
            raise HTTPException(
                status_code=status.HTTP_401_UNAUTHORIZED,
                detail="Refresh token missing or invalid format",
                headers={"WWW-Authenticate": "Bearer"},
            )
# Register exception handlers BEFORE adding middleware/router
app.add_exception_handler(Exception, generic_exception_handler)
from sqlalchemy.exc import SQLAlchemyError
app.add_exception_handler(SQLAlchemyError, sqlalchemy_exception_handler)

        refresh_token = authorization.split(" ")[1]

        # Validate refresh token and get user data
        try:
            # Decode the refresh token to get the user identifier
            payload = jwt.decode(refresh_token, settings.SECRET_KEY, algorithms=["HS256"])
            user_id = payload.get("sub")
            if user_id is None:
                raise HTTPException(
                    status_code=status.HTTP_401_UNAUTHORIZED,
                    detail="Invalid refresh token",
                )
        except JWTError:
            raise HTTPException(
                status_code=status.HTTP_401_UNAUTHORIZED,
                detail="Invalid refresh token",
            )

        # Get user from database
        async with get_session() as session:
            result = await session.execute(select(User).where(User.id == int(user_id)))
            user = result.scalar_one_or_none()

            if not user or not user.is_active:
                raise HTTPException(
                    status_code=status.HTTP_401_UNAUTHORIZED,
                    detail="User not found or inactive",
                )

        # Generate new tokens
        new_access_token = await access_strategy.write_token(user)
        new_refresh_token = await refresh_strategy.write_token(user)

        return RefreshResponse(
            access_token=new_access_token,
            refresh_token=new_refresh_token,
            token_type="bearer"
        )

    except HTTPException:
        raise
    except Exception as e:
        logger.error(f"Error refreshing token: {e}")
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED,
            detail="Invalid refresh token"
        )

# --- Include API Routers ---
# Include OAuth routes first (no auth required)
app.include_router(oauth_router, prefix="/auth", tags=["auth"])

# Include FastAPI-Users routes
app.include_router(
    fastapi_users.get_auth_router(auth_backend),
    prefix="/auth/jwt",
    tags=["auth"],
)
app.include_router(
    fastapi_users.get_register_router(UserPublic, UserCreate),
    prefix="/auth",
    tags=["auth"],
)
app.include_router(
    fastapi_users.get_reset_password_router(),
    prefix="/auth",
    tags=["auth"],
)
app.include_router(
    fastapi_users.get_verify_router(UserPublic),
    prefix="/auth",
    tags=["auth"],
)
app.include_router(
    fastapi_users.get_users_router(UserPublic, UserUpdate),
    prefix="/users",
    tags=["users"],
)

# Include your API router
app.include_router(api_router, prefix=settings.API_PREFIX)
# --- End Include API Routers ---

# Health check endpoint
@app.get("/health", tags=["Health"])
async def health_check():
    """
    Health check endpoint for load balancers and monitoring.
    """
    return {
        "status": settings.HEALTH_STATUS_OK,
        "environment": settings.ENVIRONMENT,
        "version": settings.API_VERSION
    }
    """Minimal health check endpoint that avoids leaking build metadata."""
    return {"status": settings.HEALTH_STATUS_OK}

# --- Root Endpoint (Optional - outside the main API structure) ---
@app.get("/", tags=["Root"])
async def read_root():
    """
    Provides a simple welcome message at the root path.
    Useful for basic reachability checks.
    """
    """Public root endpoint with minimal information."""
    logger.info("Root endpoint '/' accessed.")
    return {
        "message": settings.ROOT_MESSAGE,
        "environment": settings.ENVIRONMENT,
        "version": settings.API_VERSION
    }
# --- End Root Endpoint ---
    return {"message": settings.ROOT_MESSAGE}

async def run_migrations():
    """Run database migrations."""
    try:
        logger.info("Running database migrations...")
        # Get the absolute path to the alembic directory
        base_path = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
        alembic_path = os.path.join(base_path, 'alembic')

        # Add alembic directory to Python path
        if alembic_path not in sys.path:
            sys.path.insert(0, alembic_path)

        # Import and run migrations
        from migrations import run_migrations as run_db_migrations
        await run_db_migrations()

@@ -240,11 +110,7 @@ async def run_migrations():
async def startup_event():
    """Initialize services on startup."""
    logger.info(f"Application startup in {settings.ENVIRONMENT} environment...")

    # Run database migrations
    # await run_migrations()

    # Initialize scheduler
    init_scheduler()
    logger.info("Application startup complete.")

@@ -252,15 +118,12 @@ async def startup_event():
async def shutdown_event():
    """Cleanup services on shutdown."""
    logger.info("Application shutdown: Disconnecting from database...")
    # await database.engine.dispose() # Close connection pool
    shutdown_scheduler()
    # Close Redis connection pool to avoid leaking file descriptors.
    try:
        from app.core.redis import redis_pool
        await redis_pool.aclose()
        logger.info("Redis pool closed.")
    except Exception as e:
        logger.warning(f"Error closing Redis pool: {e}")
    logger.info("Application shutdown complete.")
# --- End Events ---


# --- Direct Run (for simple local testing if needed) ---
# It's better to use `uvicorn app.main:app --reload` from the terminal
# if __name__ == "__main__":
#     logger.info("Starting Uvicorn server directly from main.py")
#     uvicorn.run(app, host="0.0.0.0", port=8000)
# ------------------------------------------------------
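The removed refresh endpoint reduces to two validation steps before any database work: a well-formed `Bearer` Authorization header, and a `sub` claim in the decoded token payload. A minimal sketch with the JWT decoding abstracted away; `extract_bearer_token` and `validate_payload` are illustrative stand-ins, not the jose or FastAPI API, and `validate_payload` receives an already-decoded dict.

```python
from typing import Optional

# Hedged sketch of the refresh flow's header and claim checks.
def extract_bearer_token(authorization: Optional[str]) -> str:
    # Mirrors: if not authorization or not authorization.startswith("Bearer ")
    if not authorization or not authorization.startswith("Bearer "):
        raise ValueError("Refresh token missing or invalid format")
    return authorization.split(" ", 1)[1]

def validate_payload(payload: dict) -> int:
    # Mirrors: user_id = payload.get("sub"); reject if missing
    user_id = payload.get("sub")
    if user_id is None:
        raise ValueError("Invalid refresh token")
    return int(user_id)

print(extract_bearer_token("Bearer abc.def.ghi"))  # abc.def.ghi
print(validate_payload({"sub": "42"}))             # 42
```

In the real endpoint these failures surface as `HTTPException(401)` rather than `ValueError`.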
303
be/app/models.py
303
be/app/models.py
@ -1,4 +1,3 @@
|
||||
# app/models.py
|
||||
import enum
|
||||
import secrets
|
||||
from datetime import datetime, timedelta, timezone
|
||||
@ -14,19 +13,30 @@ from sqlalchemy import (
|
||||
UniqueConstraint,
|
||||
Index,
|
||||
DDL,
|
||||
event,
|
||||
delete,
|
||||
func,
|
||||
text as sa_text,
|
||||
Text, # <-- Add Text for description
|
||||
Numeric, # <-- Add Numeric for price
|
||||
Text,
|
||||
Numeric,
|
||||
CheckConstraint,
|
||||
Date # Added Date for Chore model
|
||||
Date
|
||||
)
|
||||
from sqlalchemy.orm import relationship, backref
|
||||
from sqlalchemy.orm import relationship, declared_attr
|
||||
from sqlalchemy.dialects.postgresql import JSONB
|
||||
from sqlalchemy.ext.declarative import declared_attr
|
||||
|
||||
from .database import Base
|
||||
|
||||
class SoftDeleteMixin:
|
||||
deleted_at = Column(DateTime(timezone=True), nullable=True, index=True)
|
||||
is_deleted = Column(Boolean, default=False, nullable=False, index=True)
|
||||
|
||||
@declared_attr
|
||||
def __mapper_args__(cls):
|
||||
return {
|
||||
'polymorphic_identity': cls.__name__.lower(),
|
||||
'passive_deletes': True
|
||||
}
|
||||
|
||||
# --- Enums ---
|
||||
class UserRoleEnum(enum.Enum):
|
||||
owner = "owner"
|
||||
@ -71,8 +81,21 @@ class ChoreTypeEnum(enum.Enum):
|
||||
personal = "personal"
|
||||
group = "group"
|
||||
|
||||
class ChoreHistoryEventTypeEnum(str, enum.Enum):
|
||||
CREATED = "created"
|
||||
UPDATED = "updated"
|
||||
DELETED = "deleted"
|
||||
COMPLETED = "completed"
|
||||
REOPENED = "reopened"
|
||||
ASSIGNED = "assigned"
|
||||
UNASSIGNED = "unassigned"
|
||||
REASSIGNED = "reassigned"
|
||||
SCHEDULE_GENERATED = "schedule_generated"
|
||||
DUE_DATE_CHANGED = "due_date_changed"
|
||||
DETAILS_CHANGED = "details_changed"
|
||||
|
||||
# --- User Model ---
|
||||
class User(Base):
|
||||
class User(Base, SoftDeleteMixin):
|
||||
__tablename__ = "users"
|
||||
|
||||
id = Column(Integer, primary_key=True, index=True)
|
||||
@ -82,78 +105,63 @@ class User(Base):
|
||||
is_active = Column(Boolean, default=True, nullable=False)
|
||||
is_superuser = Column(Boolean, default=False, nullable=False)
|
||||
is_verified = Column(Boolean, default=False, nullable=False)
|
||||
is_guest = Column(Boolean, default=False, nullable=False)
|
||||
created_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
|
||||
|
||||
# --- Relationships ---
|
||||
created_groups = relationship("Group", back_populates="creator")
|
||||
group_associations = relationship("UserGroup", back_populates="user", cascade="all, delete-orphan")
|
||||
group_associations = relationship("UserGroup", back_populates="user")
|
||||
created_invites = relationship("Invite", back_populates="creator")
|
||||
|
||||
# --- NEW Relationships for Lists/Items ---
|
||||
created_lists = relationship("List", foreign_keys="List.created_by_id", back_populates="creator") # Link List.created_by_id -> User
|
||||
added_items = relationship("Item", foreign_keys="Item.added_by_id", back_populates="added_by_user") # Link Item.added_by_id -> User
|
||||
completed_items = relationship("Item", foreign_keys="Item.completed_by_id", back_populates="completed_by_user") # Link Item.completed_by_id -> User
|
||||
# --- End NEW Relationships ---
|
||||
|
||||
# --- Relationships for Cost Splitting ---
|
||||
expenses_paid = relationship("Expense", foreign_keys="Expense.paid_by_user_id", back_populates="paid_by_user", cascade="all, delete-orphan")
|
||||
expenses_created = relationship("Expense", foreign_keys="Expense.created_by_user_id", back_populates="created_by_user", cascade="all, delete-orphan")
|
||||
expense_splits = relationship("ExpenseSplit", foreign_keys="ExpenseSplit.user_id", back_populates="user", cascade="all, delete-orphan")
|
||||
settlements_made = relationship("Settlement", foreign_keys="Settlement.paid_by_user_id", back_populates="payer", cascade="all, delete-orphan")
|
||||
settlements_received = relationship("Settlement", foreign_keys="Settlement.paid_to_user_id", back_populates="payee", cascade="all, delete-orphan")
|
||||
settlements_created = relationship("Settlement", foreign_keys="Settlement.created_by_user_id", back_populates="created_by_user", cascade="all, delete-orphan")
|
||||
# --- End Relationships for Cost Splitting ---
|
||||
|
||||
# --- Relationships for Chores ---
|
||||
created_lists = relationship("List", foreign_keys="List.created_by_id", back_populates="creator")
|
||||
added_items = relationship("Item", foreign_keys="Item.added_by_id", back_populates="added_by_user")
|
||||
completed_items = relationship("Item", foreign_keys="Item.completed_by_id", back_populates="completed_by_user")
|
||||
expenses_paid = relationship("Expense", foreign_keys="Expense.paid_by_user_id", back_populates="paid_by_user")
|
||||
expenses_created = relationship("Expense", foreign_keys="Expense.created_by_user_id", back_populates="created_by_user")
|
||||
expense_splits = relationship("ExpenseSplit", foreign_keys="ExpenseSplit.user_id", back_populates="user")
|
||||
settlements_made = relationship("Settlement", foreign_keys="Settlement.paid_by_user_id", back_populates="payer")
|
||||
settlements_received = relationship("Settlement", foreign_keys="Settlement.paid_to_user_id", back_populates="payee")
|
||||
settlements_created = relationship("Settlement", foreign_keys="Settlement.created_by_user_id", back_populates="created_by_user")
|
||||
created_chores = relationship("Chore", foreign_keys="[Chore.created_by_id]", back_populates="creator")
|
||||
assigned_chores = relationship("ChoreAssignment", back_populates="assigned_user", cascade="all, delete-orphan")
|
||||
# --- End Relationships for Chores ---
|
||||
assigned_chores = relationship("ChoreAssignment", back_populates="assigned_user")
|
||||
chore_history_entries = relationship("ChoreHistory", back_populates="changed_by_user")
|
||||
assignment_history_entries = relationship("ChoreAssignmentHistory", back_populates="changed_by_user")
|
||||
financial_audit_logs = relationship("FinancialAuditLog", back_populates="user")
|
||||
time_entries = relationship("TimeEntry", back_populates="user")
|
||||
categories = relationship("Category", back_populates="user")
|
||||
|
||||
|
||||
# --- Group Model ---
|
||||
class Group(Base):
|
||||
class Group(Base, SoftDeleteMixin):
    __tablename__ = "groups"

    id = Column(Integer, primary_key=True, index=True)
    name = Column(String, index=True, nullable=False)
    created_by_id = Column(Integer, ForeignKey("users.id"), nullable=False)
    created_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
    updated_at = Column(DateTime(timezone=True), server_default=func.now(), onupdate=func.now(), nullable=False)
    version = Column(Integer, nullable=False, default=1, server_default='1')

    # --- Relationships ---
    creator = relationship("User", back_populates="created_groups")
    member_associations = relationship("UserGroup", back_populates="group", cascade="all, delete-orphan")
    invites = relationship("Invite", back_populates="group", cascade="all, delete-orphan")
    member_associations = relationship("UserGroup", back_populates="group")
    invites = relationship("Invite", back_populates="group")

    # --- NEW Relationship for Lists ---
    lists = relationship("List", back_populates="group", cascade="all, delete-orphan")  # Link List.group_id -> Group
    # --- End NEW Relationship ---
    lists = relationship("List", back_populates="group")
    expenses = relationship("Expense", foreign_keys="Expense.group_id", back_populates="group")
    settlements = relationship("Settlement", foreign_keys="Settlement.group_id", back_populates="group")
    chores = relationship("Chore", back_populates="group")
    chore_history = relationship("ChoreHistory", back_populates="group")

    # --- Relationships for Cost Splitting ---
    expenses = relationship("Expense", foreign_keys="Expense.group_id", back_populates="group", cascade="all, delete-orphan")
    settlements = relationship("Settlement", foreign_keys="Settlement.group_id", back_populates="group", cascade="all, delete-orphan")
    # --- End Relationships for Cost Splitting ---

    # --- Relationship for Chores ---
    chores = relationship("Chore", back_populates="group", cascade="all, delete-orphan")
    # --- End Relationship for Chores ---

# --- UserGroup Association Model ---
class UserGroup(Base):
    __tablename__ = "user_groups"
    __table_args__ = (UniqueConstraint('user_id', 'group_id', name='uq_user_group'),)

    id = Column(Integer, primary_key=True, index=True)
    user_id = Column(Integer, ForeignKey("users.id", ondelete="CASCADE"), nullable=False)
    group_id = Column(Integer, ForeignKey("groups.id", ondelete="CASCADE"), nullable=False)
    user_id = Column(Integer, ForeignKey("users.id"), nullable=False)
    group_id = Column(Integer, ForeignKey("groups.id"), nullable=False)
    role = Column(SAEnum(UserRoleEnum, name="userroleenum", create_type=True), nullable=False, default=UserRoleEnum.member)
    joined_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)

    user = relationship("User", back_populates="group_associations")
    group = relationship("Group", back_populates="member_associations")
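The models in this diff mix in `SoftDeleteMixin`, which is defined elsewhere in the codebase. A minimal sketch of what such a mixin typically provides — the `deleted_at` column name, the `is_deleted`/`soft_delete` helpers, and the demo `Widget` model are assumptions, not the repo's actual implementation:

```python
from datetime import datetime, timezone
from sqlalchemy import Column, DateTime, Integer
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class SoftDeleteMixin:
    """Rows are flagged as deleted instead of being removed from the table."""
    deleted_at = Column(DateTime(timezone=True), nullable=True)

    @property
    def is_deleted(self) -> bool:
        return self.deleted_at is not None

    def soft_delete(self) -> None:
        # Record when the row was logically deleted.
        self.deleted_at = datetime.now(timezone.utc)

class Widget(SoftDeleteMixin, Base):  # hypothetical demo model
    __tablename__ = "widgets"
    id = Column(Integer, primary_key=True)
```

This pattern would explain why the diff drops `cascade="all, delete-orphan"` and `ondelete="CASCADE"` in several places: with soft deletes, child rows should no longer be physically destroyed when a parent is "deleted".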
||||
# --- Invite Model ---
class Invite(Base):
    __tablename__ = "invites"
    __table_args__ = (
@@ -162,7 +170,7 @@ class Invite(Base):

    id = Column(Integer, primary_key=True, index=True)
    code = Column(String, unique=False, index=True, nullable=False, default=lambda: secrets.token_urlsafe(16))
    group_id = Column(Integer, ForeignKey("groups.id", ondelete="CASCADE"), nullable=False)
    group_id = Column(Integer, ForeignKey("groups.id"), nullable=False)
    created_by_id = Column(Integer, ForeignKey("users.id"), nullable=False)
    created_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
    expires_at = Column(DateTime(timezone=True), nullable=False, default=lambda: datetime.now(timezone.utc) + timedelta(days=7))
@@ -172,69 +180,58 @@ class Invite(Base):
    creator = relationship("User", back_populates="created_invites")
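The `code` and `expires_at` column defaults above can be exercised on their own; `new_invite_defaults` is a hypothetical helper that simply mirrors the two `default=` lambdas:

```python
import secrets
from datetime import datetime, timedelta, timezone

def new_invite_defaults() -> dict:
    """Mirror the Invite column defaults: a URL-safe code and a 7-day expiry."""
    return {
        "code": secrets.token_urlsafe(16),  # 16 random bytes -> ~22 URL-safe chars
        "expires_at": datetime.now(timezone.utc) + timedelta(days=7),
    }
```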
# === NEW: List Model ===
class List(Base):
class List(Base, SoftDeleteMixin):
    __tablename__ = "lists"

    id = Column(Integer, primary_key=True, index=True)
    name = Column(String, index=True, nullable=False)
    description = Column(Text, nullable=True)
    created_by_id = Column(Integer, ForeignKey("users.id"), nullable=False)  # Who created this list
    group_id = Column(Integer, ForeignKey("groups.id"), nullable=True)  # Which group it belongs to (NULL if personal)
    created_by_id = Column(Integer, ForeignKey("users.id"), nullable=False)
    group_id = Column(Integer, ForeignKey("groups.id"), nullable=True)
    is_complete = Column(Boolean, default=False, nullable=False)
    created_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
    updated_at = Column(DateTime(timezone=True), server_default=func.now(), onupdate=func.now(), nullable=False)
    version = Column(Integer, nullable=False, default=1, server_default='1')
    archived_at = Column(DateTime(timezone=True), nullable=True, index=True)

    # --- Relationships ---
    creator = relationship("User", back_populates="created_lists")  # Link to User.created_lists
    group = relationship("Group", back_populates="lists")  # Link to Group.lists
    creator = relationship("User", back_populates="created_lists")
    group = relationship("Group", back_populates="lists")
    items = relationship(
        "Item",
        back_populates="list",
        cascade="all, delete-orphan",
        order_by="Item.position.asc(), Item.created_at.asc()"  # Default order by position, then creation
        order_by="Item.position.asc(), Item.created_at.asc()"
    )

    # --- Relationships for Cost Splitting ---
    expenses = relationship("Expense", foreign_keys="Expense.list_id", back_populates="list", cascade="all, delete-orphan")
    # --- End Relationships for Cost Splitting ---
    expenses = relationship("Expense", foreign_keys="Expense.list_id", back_populates="list")

# === NEW: Item Model ===
class Item(Base):
class Item(Base, SoftDeleteMixin):
    __tablename__ = "items"
    __table_args__ = (
        Index('ix_items_list_id_position', 'list_id', 'position'),
    )

    id = Column(Integer, primary_key=True, index=True)
    list_id = Column(Integer, ForeignKey("lists.id", ondelete="CASCADE"), nullable=False)  # Belongs to which list
    list_id = Column(Integer, ForeignKey("lists.id"), nullable=False)
    name = Column(String, index=True, nullable=False)
    quantity = Column(String, nullable=True)  # Flexible quantity (e.g., "1", "2 lbs", "a bunch")
    quantity = Column(String, nullable=True)
    is_complete = Column(Boolean, default=False, nullable=False)
    price = Column(Numeric(10, 2), nullable=True)  # For cost splitting later (e.g., 12345678.99)
    position = Column(Integer, nullable=False, server_default='0')  # For ordering
    added_by_id = Column(Integer, ForeignKey("users.id"), nullable=False)  # Who added this item
    completed_by_id = Column(Integer, ForeignKey("users.id"), nullable=True)  # Who marked it complete
    price = Column(Numeric(10, 2), nullable=True)
    position = Column(Integer, nullable=False, server_default='0')
    category_id = Column(Integer, ForeignKey('categories.id'), nullable=True)
    added_by_id = Column(Integer, ForeignKey("users.id"), nullable=False)
    completed_by_id = Column(Integer, ForeignKey("users.id"), nullable=True)
    created_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
    updated_at = Column(DateTime(timezone=True), server_default=func.now(), onupdate=func.now(), nullable=False)
    version = Column(Integer, nullable=False, default=1, server_default='1')

    # --- Relationships ---
    list = relationship("List", back_populates="items")  # Link to List.items
    added_by_user = relationship("User", foreign_keys=[added_by_id], back_populates="added_items")  # Link to User.added_items
    completed_by_user = relationship("User", foreign_keys=[completed_by_id], back_populates="completed_items")  # Link to User.completed_items
    list = relationship("List", back_populates="items")
    added_by_user = relationship("User", foreign_keys=[added_by_id], back_populates="added_items")
    completed_by_user = relationship("User", foreign_keys=[completed_by_id], back_populates="completed_items")
    expenses = relationship("Expense", back_populates="item")
    category = relationship("Category", back_populates="items")

    # --- Relationships for Cost Splitting ---
    # If an item directly results in an expense, or an expense can be tied to an item.
    expenses = relationship("Expense", back_populates="item")  # An item might have multiple associated expenses
    # --- End Relationships for Cost Splitting ---

# === NEW Models for Advanced Cost Splitting ===

class Expense(Base):
class Expense(Base, SoftDeleteMixin):
    __tablename__ = "expenses"

    id = Column(Integer, primary_key=True, index=True)
@@ -244,7 +241,6 @@ class Expense(Base):
    expense_date = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
    split_type = Column(SAEnum(SplitTypeEnum, name="splittypeenum", create_type=True), nullable=False)

    # Foreign Keys
    list_id = Column(Integer, ForeignKey("lists.id"), nullable=True, index=True)
    group_id = Column(Integer, ForeignKey("groups.id"), nullable=True, index=True)
    item_id = Column(Integer, ForeignKey("items.id"), nullable=True)
@@ -255,17 +251,15 @@ class Expense(Base):
    updated_at = Column(DateTime(timezone=True), server_default=func.now(), onupdate=func.now(), nullable=False)
    version = Column(Integer, nullable=False, default=1, server_default='1')

    # Relationships
    paid_by_user = relationship("User", foreign_keys=[paid_by_user_id], back_populates="expenses_paid")
    created_by_user = relationship("User", foreign_keys=[created_by_user_id], back_populates="expenses_created")
    list = relationship("List", foreign_keys=[list_id], back_populates="expenses")
    group = relationship("Group", foreign_keys=[group_id], back_populates="expenses")
    item = relationship("Item", foreign_keys=[item_id], back_populates="expenses")
    splits = relationship("ExpenseSplit", back_populates="expense", cascade="all, delete-orphan")
    splits = relationship("ExpenseSplit", back_populates="expense")
    parent_expense = relationship("Expense", remote_side=[id], back_populates="child_expenses")
    child_expenses = relationship("Expense", back_populates="parent_expense")
    overall_settlement_status = Column(SAEnum(ExpenseOverallStatusEnum, name="expenseoverallstatusenum", create_type=True), nullable=False, server_default=ExpenseOverallStatusEnum.unpaid.value, default=ExpenseOverallStatusEnum.unpaid)
    # --- Recurrence fields ---
    is_recurring = Column(Boolean, default=False, nullable=False)
    recurrence_pattern_id = Column(Integer, ForeignKey("recurrence_patterns.id"), nullable=True)
    recurrence_pattern = relationship("RecurrencePattern", back_populates="expenses", uselist=False)  # One-to-one
@@ -274,11 +268,10 @@ class Expense(Base):
    last_occurrence = Column(DateTime(timezone=True), nullable=True)

    __table_args__ = (
        # Ensure at least one context is provided
        CheckConstraint('(item_id IS NOT NULL) OR (list_id IS NOT NULL) OR (group_id IS NOT NULL)', name='chk_expense_context'),
        CheckConstraint('group_id IS NOT NULL OR list_id IS NOT NULL', name='ck_expense_group_or_list'),
    )

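The `CheckConstraint`s guard the expense's context at the database level: an expense must be attached to at least one of item, list, or group. The same rule can be mirrored in application code before a flush; `expense_context_is_valid` is a hypothetical helper restating `chk_expense_context`:

```python
from typing import Optional

def expense_context_is_valid(item_id: Optional[int],
                             list_id: Optional[int],
                             group_id: Optional[int]) -> bool:
    """Application-level mirror of chk_expense_context:
    at least one of the three foreign keys must be set."""
    return any(v is not None for v in (item_id, list_id, group_id))
```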
class ExpenseSplit(Base):
class ExpenseSplit(Base, SoftDeleteMixin):
    __tablename__ = "expense_splits"
    __table_args__ = (
        UniqueConstraint('expense_id', 'user_id', name='uq_expense_user_split'),
@@ -286,7 +279,7 @@ class ExpenseSplit(Base):
    )

    id = Column(Integer, primary_key=True, index=True)
    expense_id = Column(Integer, ForeignKey("expenses.id", ondelete="CASCADE"), nullable=False)
    expense_id = Column(Integer, ForeignKey("expenses.id"), nullable=False)
    user_id = Column(Integer, ForeignKey("users.id"), nullable=False)

    owed_amount = Column(Numeric(10, 2), nullable=False)
@@ -296,14 +289,12 @@ class ExpenseSplit(Base):
    created_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
    updated_at = Column(DateTime(timezone=True), server_default=func.now(), onupdate=func.now(), nullable=False)

    # Relationships
    expense = relationship("Expense", back_populates="splits")
    user = relationship("User", foreign_keys=[user_id], back_populates="expense_splits")
    settlement_activities = relationship("SettlementActivity", back_populates="split", cascade="all, delete-orphan")
    settlement_activities = relationship("SettlementActivity", back_populates="split")

    # New fields for tracking payment status
    status = Column(SAEnum(ExpenseSplitStatusEnum, name="expensesplitstatusenum", create_type=True), nullable=False, server_default=ExpenseSplitStatusEnum.unpaid.value, default=ExpenseSplitStatusEnum.unpaid)
    paid_at = Column(DateTime(timezone=True), nullable=True)  # Timestamp when the split was fully paid
    paid_at = Column(DateTime(timezone=True), nullable=True)
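`owed_amount` is stored as `Numeric(10, 2)`, so split amounts must be quantized to cents, and an equal split has to account for remainders so the shares sum exactly to the expense total. A sketch under that constraint (the helper name and the "first user absorbs the remainder" policy are assumptions, not the repo's actual split logic):

```python
from decimal import Decimal, ROUND_DOWN

def equal_splits(total: Decimal, user_ids: list) -> dict:
    """Split `total` evenly at 2 decimal places (matching Numeric(10, 2)),
    assigning any leftover cent(s) to the first user so the sum is exact."""
    share = (total / len(user_ids)).quantize(Decimal("0.01"), rounding=ROUND_DOWN)
    splits = {uid: share for uid in user_ids}
    # Whatever rounding left over goes to the first participant.
    splits[user_ids[0]] += total - share * len(user_ids)
    return splits
```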
class Settlement(Base):
    __tablename__ = "settlements"
@@ -321,33 +312,28 @@ class Settlement(Base):
    updated_at = Column(DateTime(timezone=True), server_default=func.now(), onupdate=func.now(), nullable=False)
    version = Column(Integer, nullable=False, default=1, server_default='1')

    # Relationships
    group = relationship("Group", foreign_keys=[group_id], back_populates="settlements")
    payer = relationship("User", foreign_keys=[paid_by_user_id], back_populates="settlements_made")
    payee = relationship("User", foreign_keys=[paid_to_user_id], back_populates="settlements_received")
    created_by_user = relationship("User", foreign_keys=[created_by_user_id], back_populates="settlements_created")

    __table_args__ = (
        # Ensure payer and payee are different users
        CheckConstraint('paid_by_user_id != paid_to_user_id', name='chk_settlement_different_users'),
    )

# Potential future: PaymentMethod model, etc.

class SettlementActivity(Base):
    __tablename__ = "settlement_activities"

    id = Column(Integer, primary_key=True, index=True)
    expense_split_id = Column(Integer, ForeignKey("expense_splits.id"), nullable=False, index=True)
    paid_by_user_id = Column(Integer, ForeignKey("users.id"), nullable=False, index=True)  # User who made this part of the payment
    paid_by_user_id = Column(Integer, ForeignKey("users.id"), nullable=False, index=True)
    paid_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
    amount_paid = Column(Numeric(10, 2), nullable=False)
    created_by_user_id = Column(Integer, ForeignKey("users.id"), nullable=False, index=True)  # User who recorded this activity
    created_by_user_id = Column(Integer, ForeignKey("users.id"), nullable=False, index=True)

    created_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
    updated_at = Column(DateTime(timezone=True), server_default=func.now(), onupdate=func.now(), nullable=False)

    # --- Relationships ---
    split = relationship("ExpenseSplit", back_populates="settlement_activities")
    payer = relationship("User", foreign_keys=[paid_by_user_id], backref="made_settlement_activities")
    creator = relationship("User", foreign_keys=[created_by_user_id], backref="created_settlement_activities")
@@ -360,73 +346,138 @@ class SettlementActivity(Base):

# --- Chore Model ---
class Chore(Base):
class Chore(Base, SoftDeleteMixin):
    __tablename__ = "chores"

    id = Column(Integer, primary_key=True, index=True)
    type = Column(SAEnum(ChoreTypeEnum, name="choretypeenum", create_type=True), nullable=False)
    group_id = Column(Integer, ForeignKey("groups.id", ondelete="CASCADE"), nullable=True, index=True)
    group_id = Column(Integer, ForeignKey("groups.id"), nullable=True, index=True)
    name = Column(String, nullable=False, index=True)
    description = Column(Text, nullable=True)
    created_by_id = Column(Integer, ForeignKey("users.id"), nullable=False, index=True)
    parent_chore_id = Column(Integer, ForeignKey('chores.id'), nullable=True, index=True)

    frequency = Column(SAEnum(ChoreFrequencyEnum, name="chorefrequencyenum", create_type=True), nullable=False)
    custom_interval_days = Column(Integer, nullable=True)  # Only if frequency is 'custom'
    custom_interval_days = Column(Integer, nullable=True)

    next_due_date = Column(Date, nullable=False)  # Changed to Date
    next_due_date = Column(Date, nullable=False)
    last_completed_at = Column(DateTime(timezone=True), nullable=True)

    created_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
    updated_at = Column(DateTime(timezone=True), server_default=func.now(), onupdate=func.now(), nullable=False)

    # --- Relationships ---
    group = relationship("Group", back_populates="chores")
    creator = relationship("User", back_populates="created_chores")
    assignments = relationship("ChoreAssignment", back_populates="chore", cascade="all, delete-orphan")
    assignments = relationship("ChoreAssignment", back_populates="chore")
    history = relationship("ChoreHistory", back_populates="chore")
    parent_chore = relationship("Chore", remote_side=[id], back_populates="child_chores")
    child_chores = relationship("Chore", back_populates="parent_chore")

# --- ChoreAssignment Model ---
class ChoreAssignment(Base):
class ChoreAssignment(Base, SoftDeleteMixin):
    __tablename__ = "chore_assignments"

    id = Column(Integer, primary_key=True, index=True)
    chore_id = Column(Integer, ForeignKey("chores.id", ondelete="CASCADE"), nullable=False, index=True)
    assigned_to_user_id = Column(Integer, ForeignKey("users.id", ondelete="CASCADE"), nullable=False, index=True)
    chore_id = Column(Integer, ForeignKey("chores.id"), nullable=False, index=True)
    assigned_to_user_id = Column(Integer, ForeignKey("users.id"), nullable=False, index=True)

    due_date = Column(Date, nullable=False)  # Specific due date for this instance, changed to Date
    due_date = Column(Date, nullable=False)
    is_complete = Column(Boolean, default=False, nullable=False)
    completed_at = Column(DateTime(timezone=True), nullable=True)

    created_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
    updated_at = Column(DateTime(timezone=True), server_default=func.now(), onupdate=func.now(), nullable=False)

    # --- Relationships ---
    chore = relationship("Chore", back_populates="assignments")
    assigned_user = relationship("User", back_populates="assigned_chores")
    history = relationship("ChoreAssignmentHistory", back_populates="assignment")
    time_entries = relationship("TimeEntry", back_populates="assignment")

# === NEW: RecurrencePattern Model ===
class RecurrencePattern(Base):
class RecurrencePattern(Base, SoftDeleteMixin):
    __tablename__ = "recurrence_patterns"

    id = Column(Integer, primary_key=True, index=True)
    type = Column(SAEnum(RecurrenceTypeEnum, name="recurrencetypeenum", create_type=True), nullable=False)
    interval = Column(Integer, default=1, nullable=False)  # e.g., every 1 day, every 2 weeks
    days_of_week = Column(String, nullable=True)  # For weekly recurrences, e.g., "MON,TUE,FRI"
    # day_of_month = Column(Integer, nullable=True)  # For monthly on a specific day
    # week_of_month = Column(Integer, nullable=True)  # For monthly on a specific week (e.g., 2nd week)
    # month_of_year = Column(Integer, nullable=True)  # For yearly recurrences
    interval = Column(Integer, default=1, nullable=False)
    days_of_week = Column(String, nullable=True)
    end_date = Column(DateTime(timezone=True), nullable=True)
    max_occurrences = Column(Integer, nullable=True)

    created_at = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
    updated_at = Column(DateTime(timezone=True), server_default=func.now(), onupdate=func.now(), nullable=False)

    # Relationship back to Expenses that use this pattern (could be one-to-many if patterns are shared)
    # However, the current CRUD implies one RecurrencePattern per Expense if recurring.
    # If a pattern can be shared, this would be a one-to-many (RecurrencePattern to many Expenses).
    # For now, assuming one-to-one as implied by current Expense.recurrence_pattern relationship setup.
    expenses = relationship("Expense", back_populates="recurrence_pattern")

# === END: RecurrencePattern Model ===

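A sketch of how `type`, `interval`, and `days_of_week` might drive the next occurrence date. The function name, the string values for `type`, and the daily/weekly subset are assumptions; the repo's actual scheduling logic lives in its CRUD layer and is not shown in this diff:

```python
from datetime import date, timedelta
from typing import Optional

def next_occurrence(last: date, type_: str, interval: int,
                    days_of_week: Optional[str] = None) -> date:
    """Advance a recurring expense from its last occurrence (daily/weekly only)."""
    if type_ == "daily":
        return last + timedelta(days=interval)
    if type_ == "weekly":
        if not days_of_week:
            return last + timedelta(weeks=interval)
        # days_of_week is a CSV like "MON,WED,FRI"; find the next matching weekday.
        names = {"MON": 0, "TUE": 1, "WED": 2, "THU": 3, "FRI": 4, "SAT": 5, "SUN": 6}
        targets = {names[d.strip().upper()] for d in days_of_week.split(",")}
        for offset in range(1, 8):  # always hits within 7 days
            candidate = last + timedelta(days=offset)
            if candidate.weekday() in targets:
                return candidate
    raise ValueError(f"unsupported recurrence type: {type_}")
```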
# === NEW: Chore History Models ===

class ChoreHistory(Base):
    __tablename__ = "chore_history"

    id = Column(Integer, primary_key=True, index=True)
    chore_id = Column(Integer, ForeignKey("chores.id"), nullable=True, index=True)
    group_id = Column(Integer, ForeignKey("groups.id"), nullable=True, index=True)
    event_type = Column(SAEnum(ChoreHistoryEventTypeEnum, name="chorehistoryeventtypeenum", create_type=True), nullable=False)
    event_data = Column(JSONB, nullable=True)
    changed_by_user_id = Column(Integer, ForeignKey("users.id"), nullable=True)
    timestamp = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)

    chore = relationship("Chore", back_populates="history")
    group = relationship("Group", back_populates="chore_history")
    changed_by_user = relationship("User", back_populates="chore_history_entries")

class ChoreAssignmentHistory(Base):
    __tablename__ = "chore_assignment_history"

    id = Column(Integer, primary_key=True, index=True)
    assignment_id = Column(Integer, ForeignKey("chore_assignments.id"), nullable=False, index=True)
    event_type = Column(SAEnum(ChoreHistoryEventTypeEnum, name="chorehistoryeventtypeenum", create_type=True), nullable=False)
    event_data = Column(JSONB, nullable=True)
    changed_by_user_id = Column(Integer, ForeignKey("users.id"), nullable=True)
    timestamp = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)

    assignment = relationship("ChoreAssignment", back_populates="history")
    changed_by_user = relationship("User", back_populates="assignment_history_entries")

# --- New Models from Roadmap ---

class FinancialAuditLog(Base):
    __tablename__ = 'financial_audit_log'
    id = Column(Integer, primary_key=True, index=True)
    timestamp = Column(DateTime(timezone=True), server_default=func.now(), nullable=False)
    user_id = Column(Integer, ForeignKey('users.id'), nullable=True)
    action_type = Column(String, nullable=False, index=True)
    entity_type = Column(String, nullable=False)
    entity_id = Column(Integer, nullable=False)
    details = Column(JSONB, nullable=True)

    user = relationship("User", back_populates="financial_audit_logs")

class Category(Base, SoftDeleteMixin):
    __tablename__ = 'categories'
    id = Column(Integer, primary_key=True, index=True)
    name = Column(String, nullable=False, index=True)
    user_id = Column(Integer, ForeignKey('users.id'), nullable=True)
    group_id = Column(Integer, ForeignKey('groups.id'), nullable=True)

    user = relationship("User", back_populates="categories")
    items = relationship("Item", back_populates="category")

    __table_args__ = (UniqueConstraint('name', 'user_id', 'group_id', name='uq_category_scope'),)

class TimeEntry(Base, SoftDeleteMixin):
    __tablename__ = 'time_entries'
    id = Column(Integer, primary_key=True, index=True)
    chore_assignment_id = Column(Integer, ForeignKey('chore_assignments.id'), nullable=False)
    user_id = Column(Integer, ForeignKey('users.id'), nullable=False)
    start_time = Column(DateTime(timezone=True), nullable=False)
    end_time = Column(DateTime(timezone=True), nullable=True)
    duration_seconds = Column(Integer, nullable=True)

    assignment = relationship("ChoreAssignment", back_populates="time_entries")
    user = relationship("User", back_populates="time_entries")
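`TimeEntry.duration_seconds` is derivable from `start_time` and `end_time`; a hypothetical helper showing the computation (the stored column likely caches this value, with `None` while a timer is still running):

```python
from datetime import datetime
from typing import Optional

def compute_duration_seconds(start_time: datetime,
                             end_time: Optional[datetime]) -> Optional[int]:
    """Derive TimeEntry.duration_seconds; None means the entry is still open."""
    if end_time is None:
        return None
    return int((end_time - start_time).total_seconds())
```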
20 be/app/schemas/audit.py (Normal file)
@@ -0,0 +1,20 @@
from pydantic import BaseModel
from datetime import datetime
from typing import Optional, Dict, Any

class FinancialAuditLogBase(BaseModel):
    action_type: str
    entity_type: str
    entity_id: int
    details: Optional[Dict[str, Any]] = None

class FinancialAuditLogCreate(FinancialAuditLogBase):
    user_id: Optional[int] = None

class FinancialAuditLogPublic(FinancialAuditLogBase):
    id: int
    timestamp: datetime
    user_id: Optional[int] = None

    class Config:
        orm_mode = True
@@ -1,13 +1,7 @@
# app/schemas/auth.py
from pydantic import BaseModel, EmailStr
from pydantic import BaseModel
from app.config import settings

class Token(BaseModel):
    access_token: str
    refresh_token: str  # Added refresh token
    token_type: str = settings.TOKEN_TYPE  # Use configured token type

# Optional: If you preferred not to use OAuth2PasswordRequestForm
# class UserLogin(BaseModel):
#     email: EmailStr
#     password: str
    refresh_token: str
    token_type: str = settings.TOKEN_TYPE
19 be/app/schemas/category.py (Normal file)
@@ -0,0 +1,19 @@
from pydantic import BaseModel
from typing import Optional

class CategoryBase(BaseModel):
    name: str

class CategoryCreate(CategoryBase):
    pass

class CategoryUpdate(CategoryBase):
    pass

class CategoryPublic(CategoryBase):
    id: int
    user_id: Optional[int] = None
    group_id: Optional[int] = None

    class Config:
        orm_mode = True
@@ -1,14 +1,32 @@
from __future__ import annotations
from datetime import date, datetime
from typing import Optional, List
from pydantic import BaseModel, ConfigDict, field_validator
from typing import Optional, List, Any
from pydantic import BaseModel, ConfigDict, field_validator, model_validator
from ..models import ChoreFrequencyEnum, ChoreTypeEnum, ChoreHistoryEventTypeEnum
from .user import UserPublic

class ChoreAssignmentPublic(BaseModel):
    pass

class ChoreHistoryPublic(BaseModel):
    id: int
    event_type: ChoreHistoryEventTypeEnum
    event_data: Optional[dict[str, Any]] = None
    changed_by_user: Optional[UserPublic] = None
    timestamp: datetime

    model_config = ConfigDict(from_attributes=True)

class ChoreAssignmentHistoryPublic(BaseModel):
    id: int
    event_type: ChoreHistoryEventTypeEnum
    event_data: Optional[dict[str, Any]] = None
    changed_by_user: Optional[UserPublic] = None
    timestamp: datetime

    model_config = ConfigDict(from_attributes=True)

# Assuming ChoreFrequencyEnum is imported from models
# Adjust the import path if necessary based on your project structure.
# e.g., from app.models import ChoreFrequencyEnum
from ..models import ChoreFrequencyEnum, ChoreTypeEnum, User as UserModel  # For UserPublic relation
from .user import UserPublic  # For embedding user information

# Chore Schemas
class ChoreBase(BaseModel):
    name: str
    description: Optional[str] = None
@@ -17,36 +35,26 @@ class ChoreBase(BaseModel):
    next_due_date: date  # For creation, this will be the initial due date
    type: ChoreTypeEnum

    @field_validator('custom_interval_days', mode='before')
    @classmethod
    def check_custom_interval_days(cls, value, values):
        # Pydantic v2 uses `values.data` to get all fields
        # For older Pydantic, it might just be `values`
        # This is a simplified check; actual access might differ slightly
        # based on Pydantic version context within the validator.
        # The goal is to ensure custom_interval_days is present if frequency is 'custom'.
        # This validator might be more complex in a real Pydantic v2 setup.

        # A more direct way if 'frequency' is already parsed into values.data:
        # freq = values.data.get('frequency')
        # For this example, we'll assume 'frequency' might not be in 'values.data' yet
        # if 'custom_interval_days' is validated 'before' 'frequency'.
        # A truly robust validator might need to be on the whole model or run 'after'.
        # For now, this is a placeholder for the logic.
        # Consider if this validation is better handled at the service/CRUD layer for complex cases.
        return value
    @model_validator(mode='after')
    def validate_custom_frequency(self):
        if self.frequency == ChoreFrequencyEnum.custom:
            if self.custom_interval_days is None or self.custom_interval_days <= 0:
                raise ValueError("custom_interval_days must be a positive integer when frequency is 'custom'")
        return self

class ChoreCreate(ChoreBase):
    group_id: Optional[int] = None
    parent_chore_id: Optional[int] = None
    assigned_to_user_id: Optional[int] = None

    @field_validator('group_id')
    @classmethod
    def validate_group_id(cls, v, values):
        if values.data.get('type') == ChoreTypeEnum.group and v is None:
    @model_validator(mode='after')
    def validate_group_id_with_type(self):
        if self.type == ChoreTypeEnum.group and self.group_id is None:
            raise ValueError("group_id is required for group chores")
        if values.data.get('type') == ChoreTypeEnum.personal and v is not None:
            raise ValueError("group_id must be None for personal chores")
            return v
        if self.type == ChoreTypeEnum.personal and self.group_id is not None:
            # Automatically clear group_id for personal chores instead of raising an error
            self.group_id = None
        return self

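The `@model_validator(mode='after')` hooks above encode two rules: a group chore requires a `group_id` (personal chores have it silently cleared), and a 'custom' frequency requires a positive interval. The same logic as a plain function, independent of Pydantic — the helper name and the string stand-ins for the enums are assumptions for illustration:

```python
from typing import Optional

def normalize_chore_fields(type_: str, group_id: Optional[int],
                           frequency: str,
                           custom_interval_days: Optional[int]) -> Optional[int]:
    """Restate the chore validators; returns the normalized group_id."""
    if frequency == "custom" and (custom_interval_days is None or custom_interval_days <= 0):
        raise ValueError("custom_interval_days must be a positive integer when frequency is 'custom'")
    if type_ == "group" and group_id is None:
        raise ValueError("group_id is required for group chores")
    if type_ == "personal":
        group_id = None  # cleared automatically instead of raising
    return group_id
```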
class ChoreUpdate(BaseModel):
|
||||
name: Optional[str] = None
|
||||
@ -56,26 +64,30 @@ class ChoreUpdate(BaseModel):
|
||||
next_due_date: Optional[date] = None # Allow updating next_due_date directly if needed
|
||||
type: Optional[ChoreTypeEnum] = None
|
||||
group_id: Optional[int] = None
|
||||
parent_chore_id: Optional[int] = None # Allow moving a chore under a parent or removing association
|
||||
# last_completed_at should generally not be updated directly by user
|
||||
|
||||
@field_validator('group_id')
|
be/app/schemas/chore.py
-    @classmethod
-    def validate_group_id(cls, v, values):
-        if values.data.get('type') == ChoreTypeEnum.group and v is None:
+    @model_validator(mode='after')
+    def validate_group_id_with_type(self):
+        if self.type == ChoreTypeEnum.group and self.group_id is None:
             raise ValueError("group_id is required for group chores")
-        if values.data.get('type') == ChoreTypeEnum.personal and v is not None:
-            raise ValueError("group_id must be None for personal chores")
-        return v
+        if self.type == ChoreTypeEnum.personal and self.group_id is not None:
+            # Automatically clear group_id for personal chores instead of raising an error
+            self.group_id = None
+        return self

 class ChorePublic(ChoreBase):
     id: int
     group_id: Optional[int] = None
     created_by_id: int
     last_completed_at: Optional[datetime] = None
     parent_chore_id: Optional[int] = None
     created_at: datetime
     updated_at: datetime
     creator: Optional[UserPublic] = None # Embed creator UserPublic schema
     # group: Optional[GroupPublic] = None # Embed GroupPublic schema if needed
     assignments: List[ChoreAssignmentPublic] = []
     history: List[ChoreHistoryPublic] = []
     child_chores: List[ChorePublic] = []

     model_config = ConfigDict(from_attributes=True)

@@ -92,6 +104,7 @@ class ChoreAssignmentUpdate(BaseModel):
     # Only completion status and perhaps due_date can be updated for an assignment
     is_complete: Optional[bool] = None
     due_date: Optional[date] = None # If rescheduling an existing assignment is allowed
+    assigned_to_user_id: Optional[int] = None # For reassigning the chore

 class ChoreAssignmentPublic(ChoreAssignmentBase):
     id: int
@@ -102,10 +115,11 @@ class ChoreAssignmentPublic(ChoreAssignmentBase):
     # Embed ChorePublic and UserPublic for richer responses
     chore: Optional[ChorePublic] = None
     assigned_user: Optional[UserPublic] = None
+    history: List[ChoreAssignmentHistoryPublic] = []

     model_config = ConfigDict(from_attributes=True)

 # To handle potential circular imports if ChorePublic needs GroupPublic and GroupPublic needs ChorePublic
 # We can update forward refs after all models are defined.
-# ChorePublic.model_rebuild() # If using Pydantic v2 and forward refs were used with strings
-# ChoreAssignmentPublic.model_rebuild()
+ChorePublic.model_rebuild()
+ChoreAssignmentPublic.model_rebuild()
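The validator change in this hunk swaps a per-field `@validator` for a Pydantic v2 `@model_validator(mode='after')`, which runs on the fully built model and can read and mutate every field at once. A minimal self-contained sketch of that pattern, assuming Pydantic v2 is installed; the enum and field set here are simplified stand-ins, not the app's real `ChoreCreate`:

```python
# Sketch of the model_validator(mode='after') pattern (assumes Pydantic v2).
from enum import Enum
from typing import Optional

from pydantic import BaseModel, model_validator


class ChoreTypeEnum(str, Enum):
    personal = "personal"
    group = "group"


class ChoreCreateSketch(BaseModel):
    name: str
    type: ChoreTypeEnum
    group_id: Optional[int] = None

    @model_validator(mode="after")
    def validate_group_id_with_type(self):
        # Group chores must reference a group; personal chores silently drop one.
        if self.type == ChoreTypeEnum.group and self.group_id is None:
            raise ValueError("group_id is required for group chores")
        if self.type == ChoreTypeEnum.personal and self.group_id is not None:
            self.group_id = None
        return self
```

Unlike the old `values.data.get(...)` dance, the after-validator sees ordinary attributes, so cross-field rules read like plain Python.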
be/app/schemas/cost.py
@@ -4,10 +4,10 @@ from decimal import Decimal

 class UserCostShare(BaseModel):
     user_id: int
-    user_identifier: str # Name or email
-    items_added_value: Decimal = Decimal("0.00") # Total value of items this user added
-    amount_due: Decimal # The user's share of the total cost (for equal split, this is total_cost / num_users)
-    balance: Decimal # items_added_value - amount_due
+    user_identifier: str
+    items_added_value: Decimal = Decimal("0.00")
+    amount_due: Decimal
+    balance: Decimal

     model_config = ConfigDict(from_attributes=True)

@@ -23,19 +23,19 @@ class ListCostSummary(BaseModel):

 class UserBalanceDetail(BaseModel):
     user_id: int
-    user_identifier: str # Name or email
+    user_identifier: str
     total_paid_for_expenses: Decimal = Decimal("0.00")
     total_share_of_expenses: Decimal = Decimal("0.00")
     total_settlements_paid: Decimal = Decimal("0.00")
     total_settlements_received: Decimal = Decimal("0.00")
-    net_balance: Decimal = Decimal("0.00") # (paid_for_expenses + settlements_received) - (share_of_expenses + settlements_paid)
+    net_balance: Decimal = Decimal("0.00")
     model_config = ConfigDict(from_attributes=True)

 class SuggestedSettlement(BaseModel):
     from_user_id: int
-    from_user_identifier: str # Name or email of payer
+    from_user_identifier: str
     to_user_id: int
-    to_user_identifier: str # Name or email of payee
+    to_user_identifier: str
     amount: Decimal
     model_config = ConfigDict(from_attributes=True)

@@ -45,11 +45,5 @@ class GroupBalanceSummary(BaseModel):
     overall_total_expenses: Decimal = Decimal("0.00")
     overall_total_settlements: Decimal = Decimal("0.00")
     user_balances: List[UserBalanceDetail]
-    # Optional: Could add a list of suggested settlements to zero out balances
     suggested_settlements: Optional[List[SuggestedSettlement]] = None
     model_config = ConfigDict(from_attributes=True)

-# class SuggestedSettlement(BaseModel):
-#     from_user_id: int
-#     to_user_id: int
-#     amount: Decimal
be/app/schemas/expense.py
@@ -1,40 +1,33 @@
 # app/schemas/expense.py
 from pydantic import BaseModel, ConfigDict, validator, Field
-from typing import List, Optional, Dict, Any
+from typing import List, Optional
 from decimal import Decimal
 from datetime import datetime
+from app.models import SplitTypeEnum, ExpenseSplitStatusEnum, ExpenseOverallStatusEnum
+from app.schemas.user import UserPublic
+from app.schemas.settlement_activity import SettlementActivityPublic
+from app.schemas.recurrence import RecurrencePatternCreate, RecurrencePatternPublic

-# Assuming SplitTypeEnum is accessible here, e.g., from app.models or app.core.enums
-# For now, let's redefine it or import it if models.py is parsable by Pydantic directly
-# If it's from app.models, you might need to make app.models.SplitTypeEnum Pydantic-compatible or map it.
-# For simplicity during schema definition, I'll redefine a string enum here.
-# In a real setup, ensure this aligns with the SQLAlchemy enum in models.py.
-from app.models import SplitTypeEnum, ExpenseSplitStatusEnum, ExpenseOverallStatusEnum # Try importing directly
-from app.schemas.user import UserPublic # For user details in responses
-from app.schemas.settlement_activity import SettlementActivityPublic # For settlement activities

 # --- ExpenseSplit Schemas ---
 class ExpenseSplitBase(BaseModel):
     user_id: int
-    owed_amount: Decimal
+    owed_amount: Optional[Decimal] = None
+    share_percentage: Optional[Decimal] = None
+    share_units: Optional[int] = None
+    # Note: Status is handled by the backend, not in create/update payloads

 class ExpenseSplitCreate(ExpenseSplitBase):
-    pass # All fields from base are needed for creation
+    pass

 class ExpenseSplitPublic(ExpenseSplitBase):
     id: int
     expense_id: int
-    user: Optional[UserPublic] = None # If we want to nest user details
+    status: ExpenseSplitStatusEnum
+    user: Optional[UserPublic] = None
     created_at: datetime
     updated_at: datetime
-    status: ExpenseSplitStatusEnum # New field
-    paid_at: Optional[datetime] = None # New field
-    settlement_activities: List[SettlementActivityPublic] = [] # New field
+    paid_at: Optional[datetime] = None
+    settlement_activities: List[SettlementActivityPublic] = []
     model_config = ConfigDict(from_attributes=True)

 # --- Expense Schemas ---
 class RecurrencePatternBase(BaseModel):
     type: str = Field(..., description="Type of recurrence: daily, weekly, monthly, yearly")
     interval: int = Field(..., description="Interval of recurrence (e.g., every X days/weeks/months/years)")
@@ -63,16 +56,13 @@ class ExpenseBase(BaseModel):
     expense_date: Optional[datetime] = None
     split_type: SplitTypeEnum
     list_id: Optional[int] = None
-    group_id: Optional[int] = None # Should be present if list_id is not, and vice-versa
+    group_id: Optional[int] = None
     item_id: Optional[int] = None
     paid_by_user_id: int
     is_recurring: bool = Field(False, description="Whether this is a recurring expense")
     recurrence_pattern: Optional[RecurrencePatternCreate] = Field(None, description="Recurrence pattern for recurring expenses")

 class ExpenseCreate(ExpenseBase):
-    # For EQUAL split, splits are generated. For others, they might be provided.
-    # This logic will be in the CRUD: if split_type is EXACT_AMOUNTS, PERCENTAGE, SHARES,
-    # then 'splits_in' should be provided.
     splits_in: Optional[List[ExpenseSplitCreate]] = None

     @validator('total_amount')
@@ -81,8 +71,6 @@ class ExpenseCreate(ExpenseBase):
             raise ValueError('Total amount must be positive')
         return v

-    # Basic validation: if list_id is None, group_id must be provided.
-    # More complex cross-field validation might be needed.
     @validator('group_id', always=True)
     def check_list_or_group_id(cls, v, values):
         if values.get('list_id') is None and v is None:
@@ -106,9 +94,7 @@ class ExpenseUpdate(BaseModel):
     list_id: Optional[int] = None
     group_id: Optional[int] = None
     item_id: Optional[int] = None
-    # paid_by_user_id is usually not updatable directly to maintain integrity.
-    # Updating splits would be a more complex operation, potentially a separate endpoint or careful logic.
-    version: int # For optimistic locking
+    version: int
     is_recurring: Optional[bool] = None
     recurrence_pattern: Optional[RecurrencePatternUpdate] = None
     next_occurrence: Optional[datetime] = None
@@ -120,11 +106,8 @@ class ExpensePublic(ExpenseBase):
     version: int
     created_by_user_id: int
     splits: List[ExpenseSplitPublic] = []
-    paid_by_user: Optional[UserPublic] = None # If nesting user details
-    overall_settlement_status: ExpenseOverallStatusEnum # New field
-    # list: Optional[ListPublic] # If nesting list details
-    # group: Optional[GroupPublic] # If nesting group details
-    # item: Optional[ItemPublic] # If nesting item details
+    paid_by_user: Optional[UserPublic] = None
+    overall_settlement_status: ExpenseOverallStatusEnum
     is_recurring: bool
     next_occurrence: Optional[datetime]
     last_occurrence: Optional[datetime]
@@ -133,7 +116,6 @@ class ExpensePublic(ExpenseBase):
     generated_expenses: List['ExpensePublic'] = []
     model_config = ConfigDict(from_attributes=True)

 # --- Settlement Schemas ---
 class SettlementBase(BaseModel):
     group_id: int
     paid_by_user_id: int
@@ -159,8 +141,7 @@ class SettlementUpdate(BaseModel):
     amount: Optional[Decimal] = None
     settlement_date: Optional[datetime] = None
     description: Optional[str] = None
-    # group_id, paid_by_user_id, paid_to_user_id are typically not updatable.
-    version: int # For optimistic locking
+    version: int

 class SettlementPublic(SettlementBase):
     id: int
@@ -168,13 +149,4 @@ class SettlementPublic(SettlementBase):
     updated_at: datetime
     version: int
     created_by_user_id: int
-    # payer: Optional[UserPublic] # If we want to include payer details
-    # payee: Optional[UserPublic] # If we want to include payee details
-    # group: Optional[GroupPublic] # If we want to include group details
     model_config = ConfigDict(from_attributes=True)

-# Placeholder for nested schemas (e.g., UserPublic) if needed
-# from app.schemas.user import UserPublic
-# from app.schemas.list import ListPublic
-# from app.schemas.group import GroupPublic
-# from app.schemas.item import ItemPublic
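The bare `version: int` kept in `ExpenseUpdate` and `SettlementUpdate` is the client's half of an optimistic-locking handshake: the client must echo the version it last read, and the server rejects the write if the row has moved on. A minimal sketch of the server-side check-and-bump this implies; the names here are illustrative, not the app's actual CRUD code:

```python
# Sketch of optimistic-locking version checking (illustrative names only).
class StaleVersionError(Exception):
    """Raised when the client edited a stale copy of the row."""


def apply_update(current_version: int, client_version: int, apply) -> int:
    # Reject the write unless the client saw the latest version.
    if client_version != current_version:
        raise StaleVersionError(
            f"expected version {current_version}, got {client_version}"
        )
    apply()  # perform the actual field updates
    return current_version + 1  # the row's new version after the write
```

A conflicting writer then gets a `StaleVersionError` (typically surfaced as HTTP 409) instead of silently clobbering the other update.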
9  be/app/schemas/financials.py  Normal file
@@ -0,0 +1,9 @@
from pydantic import BaseModel
from typing import Union, List
from .expense import ExpensePublic, SettlementPublic


class FinancialActivityResponse(BaseModel):
    activities: List[Union[ExpensePublic, SettlementPublic]]

    class Config:
        orm_mode = True
be/app/schemas/group.py
@@ -1,21 +1,27 @@
 # app/schemas/group.py
 from pydantic import BaseModel, ConfigDict, computed_field
-from datetime import datetime
+from datetime import datetime, date
 from typing import Optional, List
+from .user import UserPublic
+from .chore import ChoreHistoryPublic

-from .user import UserPublic # Import UserPublic to represent members

 # Properties to receive via API on creation
 class GroupCreate(BaseModel):
     name: str

-# Properties to return to client
+class GroupDelete(BaseModel):
+    confirmation_name: str
+
+class GroupScheduleGenerateRequest(BaseModel):
+    start_date: date
+    end_date: date
+    member_ids: Optional[List[int]] = None
+
 class GroupPublic(BaseModel):
     id: int
     name: str
     created_by_id: int
     created_at: datetime
     member_associations: Optional[List["UserGroupPublic"]] = None
+    chore_history: Optional[List[ChoreHistoryPublic]] = []

     @computed_field
     @property
@@ -26,7 +32,6 @@ class GroupPublic(BaseModel):

     model_config = ConfigDict(from_attributes=True)

-# Properties for UserGroup association
 class UserGroupPublic(BaseModel):
     id: int
     user_id: int
@@ -37,6 +42,4 @@ class UserGroupPublic(BaseModel):

     model_config = ConfigDict(from_attributes=True)

-# Properties stored in DB (if needed, often GroupPublic is sufficient)
-# class GroupInDB(GroupPublic):
-#     pass
+GroupPublic.model_rebuild()
be/app/schemas/health.py
@@ -1,4 +1,4 @@
 # app/schemas/health.py

 from pydantic import BaseModel
 from app.config import settings

@@ -6,5 +6,5 @@ class HealthStatus(BaseModel):
     """
     Response model for the health check endpoint.
     """
-    status: str = settings.HEALTH_STATUS_OK # Use configured default value
+    status: str = settings.HEALTH_STATUS_OK
     database: str
be/app/schemas/invite.py
@@ -1,12 +1,9 @@
 # app/schemas/invite.py
 from pydantic import BaseModel
 from datetime import datetime

-# Properties to receive when accepting an invite
 class InviteAccept(BaseModel):
     code: str

-# Properties to return when an invite is created
 class InviteCodePublic(BaseModel):
     code: str
     expires_at: datetime
be/app/schemas/item.py
@@ -1,10 +1,13 @@
 # app/schemas/item.py
 from pydantic import BaseModel, ConfigDict
 from datetime import datetime
 from typing import Optional
 from decimal import Decimal

-# Properties to return to client
+class UserReference(BaseModel):
+    id: int
+    name: Optional[str] = None
+    model_config = ConfigDict(from_attributes=True)
+
 class ItemPublic(BaseModel):
     id: int
     list_id: int
@@ -12,26 +15,26 @@ class ItemPublic(BaseModel):
     quantity: Optional[str] = None
     is_complete: bool
     price: Optional[Decimal] = None
+    category_id: Optional[int] = None
     added_by_id: int
     completed_by_id: Optional[int] = None
+    added_by_user: Optional[UserReference] = None
+    completed_by_user: Optional[UserReference] = None
     created_at: datetime
     updated_at: datetime
     version: int
     model_config = ConfigDict(from_attributes=True)

 # Properties to receive via API on creation
 class ItemCreate(BaseModel):
     name: str
     quantity: Optional[str] = None
     # list_id will be from path param
     # added_by_id will be from current_user
+    category_id: Optional[int] = None

 # Properties to receive via API on update
 class ItemUpdate(BaseModel):
     name: Optional[str] = None
     quantity: Optional[str] = None
     is_complete: Optional[bool] = None
-    price: Optional[Decimal] = None # Price added here for update
-    position: Optional[int] = None # For reordering
+    price: Optional[Decimal] = None
+    position: Optional[int] = None
     category_id: Optional[int] = None
     version: int
     # completed_by_id will be set internally if is_complete is true
be/app/schemas/list.py
@@ -1,25 +1,20 @@
 # app/schemas/list.py
 from pydantic import BaseModel, ConfigDict
 from datetime import datetime
 from typing import Optional, List

-from .item import ItemPublic # Import item schema for nesting
+from .item import ItemPublic

 # Properties to receive via API on creation
 class ListCreate(BaseModel):
     name: str
     description: Optional[str] = None
-    group_id: Optional[int] = None # Optional for sharing
+    group_id: Optional[int] = None

 # Properties to receive via API on update
 class ListUpdate(BaseModel):
     name: Optional[str] = None
     description: Optional[str] = None
     is_complete: Optional[bool] = None
-    version: int # Client must provide the version for updates
-    # Potentially add group_id update later if needed
+    version: int

 # Base properties returned by API (common fields)
 class ListBase(BaseModel):
     id: int
     name: str
@@ -29,17 +24,15 @@ class ListBase(BaseModel):
     is_complete: bool
     created_at: datetime
     updated_at: datetime
-    version: int # Include version in responses
+    version: int

     model_config = ConfigDict(from_attributes=True)

 # Properties returned when listing lists (no items)
 class ListPublic(ListBase):
-    pass # Inherits all from ListBase
+    pass

 # Properties returned for a single list detail (includes items)
 class ListDetail(ListBase):
-    items: List[ItemPublic] = [] # Include list of items
+    items: List[ItemPublic] = []

 class ListStatus(BaseModel):
     updated_at: datetime
be/app/schemas/message.py
@@ -1,4 +1,3 @@
 # app/schemas/message.py
 from pydantic import BaseModel

 class Message(BaseModel):
be/app/schemas/ocr.py
@@ -1,6 +1,5 @@
 # app/schemas/ocr.py
 from pydantic import BaseModel
 from typing import List

 class OcrExtractResponse(BaseModel):
-    extracted_items: List[str] # A list of potential item names
+    extracted_items: List[str]
35  be/app/schemas/recurrence.py  Normal file
@@ -0,0 +1,35 @@
from pydantic import BaseModel, validator
from typing import Optional, List
from datetime import datetime


class RecurrencePatternBase(BaseModel):
    type: str
    interval: int = 1
    days_of_week: Optional[List[int]] = None
    end_date: Optional[datetime] = None
    max_occurrences: Optional[int] = None

    @validator('type')
    def type_must_be_valid(cls, v):
        if v not in ['daily', 'weekly', 'monthly', 'yearly']:
            raise ValueError("type must be one of 'daily', 'weekly', 'monthly', 'yearly'")
        return v

    @validator('days_of_week')
    def days_of_week_must_be_valid(cls, v):
        if v:
            for day in v:
                if not 0 <= day <= 6:
                    raise ValueError("days_of_week must be between 0 and 6")
        return v


class RecurrencePatternCreate(RecurrencePatternBase):
    pass


class RecurrencePatternPublic(RecurrencePatternBase):
    id: int
    created_at: datetime
    updated_at: datetime

    class Config:
        orm_mode = True
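The two field validators in this new file enforce simple constraints: `type` must be one of four recurrence kinds, and every weekday index must fall in 0..6. They can be exercised in isolation; this sketch restates the same rules as plain functions so they run without Pydantic (the function names are illustrative, not part of the app):

```python
# Standalone restatement of the RecurrencePatternBase validator rules.
from typing import List, Optional

VALID_TYPES = {'daily', 'weekly', 'monthly', 'yearly'}


def validate_type(value: str) -> str:
    if value not in VALID_TYPES:
        raise ValueError("type must be one of 'daily', 'weekly', 'monthly', 'yearly'")
    return value


def validate_days_of_week(days: Optional[List[int]]) -> Optional[List[int]]:
    # None (or an empty list) means the pattern does not pin specific weekdays.
    if days:
        for day in days:
            if not 0 <= day <= 6:
                raise ValueError("days_of_week must be between 0 and 6")
    return days
```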
be/app/schemas/settlement_activity.py
@@ -3,7 +3,7 @@ from typing import Optional, List
 from decimal import Decimal
 from datetime import datetime

-from app.schemas.user import UserPublic # Assuming UserPublic is defined here
+from app.schemas.user import UserPublic

 class SettlementActivityBase(BaseModel):
     expense_split_id: int
@@ -21,23 +21,13 @@ class SettlementActivityCreate(SettlementActivityBase):

 class SettlementActivityPublic(SettlementActivityBase):
     id: int
-    created_by_user_id: int # User who recorded this activity
+    created_by_user_id: int
     created_at: datetime
     updated_at: datetime

-    payer: Optional[UserPublic] = None # User who made this part of the payment
-    creator: Optional[UserPublic] = None # User who recorded this activity
+    payer: Optional[UserPublic] = None
+    creator: Optional[UserPublic] = None

     model_config = ConfigDict(from_attributes=True)

-# Schema for updating a settlement activity (if needed in the future)
-# class SettlementActivityUpdate(BaseModel):
-#     amount_paid: Optional[Decimal] = None
-#     paid_at: Optional[datetime] = None
-#
-#     @field_validator('amount_paid')
-#     @classmethod
-#     def amount_must_be_positive_if_provided(cls, v: Optional[Decimal]) -> Optional[Decimal]:
-#         if v is not None and v <= Decimal("0"):
-#             raise ValueError("Amount paid must be a positive value.")
-#         return v
22  be/app/schemas/time_entry.py  Normal file
@@ -0,0 +1,22 @@
from pydantic import BaseModel
from datetime import datetime
from typing import Optional


class TimeEntryBase(BaseModel):
    chore_assignment_id: int
    start_time: datetime
    end_time: Optional[datetime] = None
    duration_seconds: Optional[int] = None


class TimeEntryCreate(TimeEntryBase):
    pass


class TimeEntryUpdate(BaseModel):
    end_time: datetime


class TimeEntryPublic(TimeEntryBase):
    id: int
    user_id: int

    class Config:
        orm_mode = True
8  be/app/schemas/token.py  Normal file
@@ -0,0 +1,8 @@
from pydantic import BaseModel


class Token(BaseModel):
    access_token: str
    token_type: str


class TokenData(BaseModel):
    email: str | None = None
be/app/schemas/user.py
@@ -1,14 +1,11 @@
 # app/schemas/user.py
 from pydantic import BaseModel, EmailStr, ConfigDict
 from datetime import datetime
 from typing import Optional

-# Shared properties
 class UserBase(BaseModel):
     email: EmailStr
     name: Optional[str] = None

-# Properties to receive via API on creation
 class UserCreate(UserBase):
     password: str

@@ -22,26 +19,26 @@ class UserCreate(UserBase):
         "is_verified": False
     }

-# Properties to receive via API on update
 class UserUpdate(UserBase):
     password: Optional[str] = None
     is_active: Optional[bool] = None
     is_superuser: Optional[bool] = None
     is_verified: Optional[bool] = None

-# Properties stored in DB
+class UserClaim(BaseModel):
+    email: EmailStr
+    password: str
+
 class UserInDBBase(UserBase):
     id: int
     password_hash: str
     created_at: datetime
-    model_config = ConfigDict(from_attributes=True) # Use orm_mode in Pydantic v1
+    model_config = ConfigDict(from_attributes=True)

 # Additional properties to return via API (excluding password)
 class UserPublic(UserBase):
     id: int
     created_at: datetime
     model_config = ConfigDict(from_attributes=True)

 # Full user model including hashed password (for internal use/reading from DB)
 class User(UserInDBBase):
     pass
343  be/app/services/costs_service.py  Normal file
@@ -0,0 +1,343 @@
# be/app/services/costs_service.py
import logging
from decimal import Decimal, ROUND_HALF_UP, ROUND_DOWN
from typing import List

from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import selectinload

from app.models import (
    User as UserModel,
    Group as GroupModel,
    List as ListModel,
    Expense as ExpenseModel,
    Item as ItemModel,
    UserGroup as UserGroupModel,
    SplitTypeEnum,
    ExpenseSplit as ExpenseSplitModel,
    SettlementActivity as SettlementActivityModel,
    Settlement as SettlementModel
)
from app.schemas.cost import ListCostSummary, GroupBalanceSummary, UserCostShare, UserBalanceDetail, SuggestedSettlement
from app.schemas.expense import ExpenseCreate, ExpensePublic
from app.crud import list as crud_list
from app.crud import expense as crud_expense
from app.core.exceptions import ListNotFoundError, ListPermissionError, GroupNotFoundError, GroupPermissionError, InvalidOperationError

logger = logging.getLogger(__name__)


def calculate_suggested_settlements(user_balances: List[UserBalanceDetail]) -> List[SuggestedSettlement]:
    """
    Calculate suggested settlements to balance the finances within a group.

    This function takes the current balances of all users and suggests optimal settlements
    to minimize the number of transactions needed to settle all debts.

    Args:
        user_balances: List of UserBalanceDetail objects with their current balances

    Returns:
        List of SuggestedSettlement objects representing the suggested payments
    """
    debtors = []
    creditors = []
    epsilon = Decimal('0.01')

    for user in user_balances:
        if abs(user.net_balance) < epsilon:
            continue

        if user.net_balance < Decimal('0'):
            debtors.append({
                'user_id': user.user_id,
                'user_identifier': user.user_identifier,
                'amount': -user.net_balance
            })
        else:
            creditors.append({
                'user_id': user.user_id,
                'user_identifier': user.user_identifier,
                'amount': user.net_balance
            })

    debtors.sort(key=lambda x: x['amount'], reverse=True)
    creditors.sort(key=lambda x: x['amount'], reverse=True)

    settlements = []

    while debtors and creditors:
        debtor = debtors[0]
        creditor = creditors[0]

        amount = min(debtor['amount'], creditor['amount']).quantize(Decimal('0.01'), rounding=ROUND_HALF_UP)

        if amount > Decimal('0'):
            settlements.append(
                SuggestedSettlement(
                    from_user_id=debtor['user_id'],
                    from_user_identifier=debtor['user_identifier'],
                    to_user_id=creditor['user_id'],
                    to_user_identifier=creditor['user_identifier'],
                    amount=amount
                )
            )

        debtor['amount'] -= amount
        creditor['amount'] -= amount

        if debtor['amount'] < epsilon:
            debtors.pop(0)
        if creditor['amount'] < epsilon:
            creditors.pop(0)

    return settlements

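The greedy pairing in `calculate_suggested_settlements` (repeatedly match the largest debtor with the largest creditor) can be tried standalone. This sketch mirrors the same loop on plain dicts instead of Pydantic models, assuming the net balances sum to zero; `suggest` and its argument shape are illustrative, not names from the app:

```python
# Greedy debtor/creditor matching on plain data (illustrative names).
from decimal import Decimal, ROUND_HALF_UP


def suggest(balances):
    # balances: {user_id: net_balance}; negative = owes money, positive = is owed.
    epsilon = Decimal('0.01')
    debtors = [list(t) for t in sorted(
        ((u, -b) for u, b in balances.items() if b <= -epsilon),
        key=lambda x: x[1], reverse=True)]
    creditors = [list(t) for t in sorted(
        ((u, b) for u, b in balances.items() if b >= epsilon),
        key=lambda x: x[1], reverse=True)]

    transfers = []
    while debtors and creditors:
        # Largest remaining debt meets largest remaining credit.
        amount = min(debtors[0][1], creditors[0][1]).quantize(
            Decimal('0.01'), rounding=ROUND_HALF_UP)
        if amount > Decimal('0'):
            transfers.append((debtors[0][0], creditors[0][0], amount))
        debtors[0][1] -= amount
        creditors[0][1] -= amount
        if debtors[0][1] < epsilon:
            debtors.pop(0)
        if creditors[0][1] < epsilon:
            creditors.pop(0)
    return transfers
```

With balances `{1: -30, 2: +10, 3: +20}` this yields two transfers, user 1 paying 20 to user 3 and 10 to user 2, rather than the three an all-pairs netting might produce.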
async def get_list_cost_summary_logic(
    db: AsyncSession, list_id: int, current_user_id: int
) -> ListCostSummary:
    """
    Core logic to retrieve a calculated cost summary for a specific list.
    This version does NOT create an expense if one is not found.
    """
    await crud_list.check_list_permission(db=db, list_id=list_id, user_id=current_user_id)

    list_result = await db.execute(
        select(ListModel)
        .options(
            selectinload(ListModel.items).options(selectinload(ItemModel.added_by_user)),
            selectinload(ListModel.group).options(selectinload(GroupModel.member_associations).options(selectinload(UserGroupModel.user))),
            selectinload(ListModel.creator)
        )
        .where(ListModel.id == list_id)
    )
    db_list = list_result.scalars().first()
    if not db_list:
        raise ListNotFoundError(list_id)

    expense_result = await db.execute(
        select(ExpenseModel)
        .where(ExpenseModel.list_id == list_id)
        .options(selectinload(ExpenseModel.splits).options(selectinload(ExpenseSplitModel.user)))
    )
    db_expense = expense_result.scalars().first()

    total_list_cost = sum(item.price for item in db_list.items if item.price is not None and item.price > Decimal("0"))

    # If no expense exists or no items with cost, return a summary based on item prices alone.
    if not db_expense:
        return ListCostSummary(
            list_id=db_list.id,
            list_name=db_list.name,
            total_list_cost=total_list_cost.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP),
            num_participating_users=0,
            equal_share_per_user=Decimal("0.00"),
            user_balances=[]
        )

    # --- Calculation logic based on existing expense ---
    participating_users = set()
    user_items_added_value = {}

    for item in db_list.items:
        if item.price is not None and item.price > Decimal("0") and item.added_by_user:
            participating_users.add(item.added_by_user)
            user_items_added_value[item.added_by_user.id] = user_items_added_value.get(item.added_by_user.id, Decimal("0.00")) + item.price

    for split in db_expense.splits:
        if split.user:
            participating_users.add(split.user)

    num_participating_users = len(participating_users)
    if num_participating_users == 0:
        return ListCostSummary(
            list_id=db_list.id,
            list_name=db_list.name,
            total_list_cost=total_list_cost.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP),
            num_participating_users=0,
            equal_share_per_user=Decimal("0.00"),
            user_balances=[]
        )

    equal_share_per_user_for_response = (db_expense.total_amount / Decimal(num_participating_users)).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

    sorted_participating_users = sorted(list(participating_users), key=lambda u: u.id)
    user_final_shares = {}

    if num_participating_users > 0:
        base_share_unrounded = db_expense.total_amount / Decimal(num_participating_users)
        for user in sorted_participating_users:
            user_final_shares[user.id] = base_share_unrounded.quantize(Decimal("0.01"), rounding=ROUND_DOWN)

        sum_of_rounded_shares = sum(user_final_shares.values())
        remaining_pennies = int(((db_expense.total_amount - sum_of_rounded_shares) * Decimal("100")).to_integral_value(rounding=ROUND_HALF_UP))

        for i in range(remaining_pennies):
            user_to_adjust = sorted_participating_users[i % num_participating_users]
            user_final_shares[user_to_adjust.id] += Decimal("0.01")

    user_balances = []
    for user in sorted_participating_users:
        items_added = user_items_added_value.get(user.id, Decimal("0.00"))
        current_user_share = user_final_shares.get(user.id, Decimal("0.00"))
        balance = items_added - current_user_share
        user_identifier = user.name if user.name else user.email
        user_balances.append(
            UserCostShare(
                user_id=user.id,
                user_identifier=user_identifier,
                items_added_value=items_added.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP),
                amount_due=current_user_share.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP),
                balance=balance.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
            )
        )

    user_balances.sort(key=lambda x: x.user_identifier)
    return ListCostSummary(
        list_id=db_list.id,
        list_name=db_list.name,
        total_list_cost=db_expense.total_amount.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP),
        num_participating_users=num_participating_users,
        equal_share_per_user=equal_share_per_user_for_response,
        user_balances=user_balances
    )

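The share-rounding step in `get_list_cost_summary_logic` (round every equal share down to the cent, then hand the leftover pennies out one cent at a time over a stable user ordering) guarantees the shares sum exactly to the expense total. It can be sketched on its own; `split_equally` is an illustrative stand-in, not a function from the app:

```python
# Largest-remainder-style equal split that always sums to the total exactly.
from decimal import Decimal, ROUND_DOWN, ROUND_HALF_UP


def split_equally(total: Decimal, user_ids: list) -> dict:
    n = len(user_ids)
    ordered = sorted(user_ids)  # stable ordering so the result is deterministic
    # Round every share DOWN to the cent first.
    base = (total / Decimal(n)).quantize(Decimal('0.01'), rounding=ROUND_DOWN)
    shares = {uid: base for uid in ordered}
    # Count the pennies lost to rounding, then distribute them one by one.
    leftover = int(((total - base * n) * Decimal('100')).to_integral_value(rounding=ROUND_HALF_UP))
    for i in range(leftover):
        shares[ordered[i % n]] += Decimal('0.01')
    return shares
```

Splitting 10.00 three ways gives 3.34 / 3.33 / 3.33, with the extra cent landing on the first user in the sorted order, exactly as the service does with `sorted_participating_users`.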
async def generate_expense_from_list_logic(db: AsyncSession, list_id: int, current_user_id: int) -> ExpenseModel:
    """
    Generates and saves an ITEM_BASED expense from a list's items.
    """
    await crud_list.check_list_permission(db=db, list_id=list_id, user_id=current_user_id)

    # Check if an expense already exists for this list
    existing_expense_result = await db.execute(
        select(ExpenseModel).where(ExpenseModel.list_id == list_id)
    )
    if existing_expense_result.scalars().first():
        raise InvalidOperationError(f"An expense already exists for list {list_id}.")

    db_list = await db.get(ListModel, list_id, options=[selectinload(ListModel.items), selectinload(ListModel.creator)])
    if not db_list:
        raise ListNotFoundError(list_id)

    total_amount = sum(item.price for item in db_list.items if item.price is not None and item.price > Decimal("0"))
    if total_amount <= Decimal("0"):
        raise InvalidOperationError("Cannot create an expense for a list with no priced items.")

    expense_in = ExpenseCreate(
        description=f"Cost summary for list {db_list.name}",
        total_amount=total_amount,
        list_id=list_id,
        split_type=SplitTypeEnum.ITEM_BASED,
        paid_by_user_id=db_list.creator.id
    )
    return await crud_expense.create_expense(db=db, expense_in=expense_in, current_user_id=current_user_id)


async def get_group_balance_summary_logic(
    db: AsyncSession, group_id: int, current_user_id: int
) -> GroupBalanceSummary:
    """
    Core logic to retrieve a detailed financial balance summary for a group.
    """
    group_check_result = await db.execute(
        select(GroupModel).options(selectinload(GroupModel.member_associations).options(selectinload(UserGroupModel.user)))
        .where(GroupModel.id == group_id)
    )
    db_group = group_check_result.scalars().first()

    if not db_group:
        raise GroupNotFoundError(group_id)

    if not any(assoc.user_id == current_user_id for assoc in db_group.member_associations):
        raise GroupPermissionError(group_id, "view balance summary for")

    expenses_result = await db.execute(
        select(ExpenseModel).where(ExpenseModel.group_id == group_id)
        .options(selectinload(ExpenseModel.splits).selectinload(ExpenseSplitModel.user))
    )
    expenses = expenses_result.scalars().all()

    settlements_result = await db.execute(
        select(SettlementModel).where(SettlementModel.group_id == group_id)
        .options(selectinload(SettlementModel.paid_by_user), selectinload(SettlementModel.paid_to_user))
    )
    settlements = settlements_result.scalars().all()

    settlement_activities_result = await db.execute(
        select(SettlementActivityModel)
        .join(ExpenseSplitModel, SettlementActivityModel.expense_split_id == ExpenseSplitModel.id)
        .join(ExpenseModel, ExpenseSplitModel.expense_id == ExpenseModel.id)
        .where(ExpenseModel.group_id == group_id)
        .options(selectinload(SettlementActivityModel.payer))
    )
    settlement_activities = settlement_activities_result.scalars().all()

    user_balances_data = {}
    for assoc in db_group.member_associations:
        if assoc.user:
            user_balances_data[assoc.user.id] = {
                "user_id": assoc.user.id,
                "user_identifier": assoc.user.name if assoc.user.name else assoc.user.email,
                "total_paid_for_expenses": Decimal("0.00"),
                "initial_total_share_of_expenses": Decimal("0.00"),
                "total_amount_paid_via_settlement_activities": Decimal("0.00"),
                "total_generic_settlements_paid": Decimal("0.00"),
                "total_generic_settlements_received": Decimal("0.00"),
            }

    for expense in expenses:
        if expense.paid_by_user_id in user_balances_data:
            user_balances_data[expense.paid_by_user_id]["total_paid_for_expenses"] += expense.total_amount
        for split in expense.splits:
            if split.user_id in user_balances_data:
                user_balances_data[split.user_id]["initial_total_share_of_expenses"] += split.owed_amount
|
||||
|
||||
for activity in settlement_activities:
|
||||
if activity.paid_by_user_id in user_balances_data:
|
||||
user_balances_data[activity.paid_by_user_id]["total_amount_paid_via_settlement_activities"] += activity.amount_paid
|
||||
|
||||
for settlement in settlements:
|
||||
if settlement.paid_by_user_id in user_balances_data:
|
||||
user_balances_data[settlement.paid_by_user_id]["total_generic_settlements_paid"] += settlement.amount
|
||||
if settlement.paid_to_user_id in user_balances_data:
|
||||
user_balances_data[settlement.paid_to_user_id]["total_generic_settlements_received"] += settlement.amount
|
||||
|
||||
final_user_balances = []
|
||||
for user_id, data in user_balances_data.items():
|
||||
initial_total_share_of_expenses = data["initial_total_share_of_expenses"]
|
||||
total_amount_paid_via_settlement_activities = data["total_amount_paid_via_settlement_activities"]
|
||||
adjusted_total_share_of_expenses = initial_total_share_of_expenses - total_amount_paid_via_settlement_activities
|
||||
total_paid_for_expenses = data["total_paid_for_expenses"]
|
||||
total_generic_settlements_received = data["total_generic_settlements_received"]
|
||||
total_generic_settlements_paid = data["total_generic_settlements_paid"]
|
||||
net_balance = (
|
||||
total_paid_for_expenses + total_generic_settlements_received
|
||||
) - (adjusted_total_share_of_expenses + total_generic_settlements_paid)
|
||||
|
||||
user_detail = UserBalanceDetail(
|
||||
user_id=data["user_id"],
|
||||
user_identifier=data["user_identifier"],
|
||||
total_paid_for_expenses=total_paid_for_expenses.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP),
|
||||
total_share_of_expenses=adjusted_total_share_of_expenses.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP),
|
||||
total_settlements_paid=total_generic_settlements_paid.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP),
|
||||
total_settlements_received=total_generic_settlements_received.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP),
|
||||
net_balance=net_balance.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
|
||||
)
|
||||
final_user_balances.append(user_detail)
|
||||
|
||||
final_user_balances.sort(key=lambda x: x.user_identifier)
|
||||
suggested_settlements = calculate_suggested_settlements(final_user_balances)
|
||||
overall_total_expenses = sum(expense.total_amount for expense in expenses)
|
||||
overall_total_settlements = sum(settlement.amount for settlement in settlements)
|
||||
|
||||
return GroupBalanceSummary(
|
||||
group_id=db_group.id,
|
||||
group_name=db_group.name,
|
||||
overall_total_expenses=overall_total_expenses.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP),
|
||||
overall_total_settlements=overall_total_settlements.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP),
|
||||
user_balances=final_user_balances,
|
||||
suggested_settlements=suggested_settlements
|
||||
)
|
be/app/services/financials_service.py (new file, 31 lines)
@@ -0,0 +1,31 @@
import logging
from typing import List, Union
from datetime import datetime

from sqlalchemy.ext.asyncio import AsyncSession

from app.models import Expense as ExpenseModel, Settlement as SettlementModel
from app.crud import expense as crud_expense, settlement as crud_settlement

logger = logging.getLogger(__name__)


async def get_user_financial_activity(
    db: AsyncSession, user_id: int
) -> List[Union[ExpenseModel, SettlementModel]]:
    """
    Retrieves and merges all financial activities (expenses and settlements) for a user.
    The combined list is sorted by date.
    """
    # Fetch all accessible expenses
    expenses = await crud_expense.get_user_accessible_expenses(db, user_id=user_id, limit=200)  # Using a generous limit

    # Fetch all settlements involving the user
    settlements = await crud_settlement.get_settlements_involving_user(db, user_id=user_id, limit=200)  # Using a generous limit

    # Combine and sort the activities.
    # The lambda picks the primary date for sorting from either type of object.
    combined_activity = sorted(
        expenses + settlements,
        key=lambda x: x.expense_date if isinstance(x, ExpenseModel) else x.settlement_date,
        reverse=True
    )

    return combined_activity
@@ -21,7 +21,5 @@ pytest>=7.4.0
pytest-asyncio>=0.21.0
pytest-cov>=4.1.0
httpx>=0.24.0  # For async HTTP testing
aiosqlite>=0.19.0  # For async SQLite support in tests

# Scheduler
APScheduler==3.10.4
redis>=5.0.0
@@ -24,6 +24,7 @@ from app.crud.settlement_activity import (
    update_expense_overall_status  # For direct testing if needed
)
from app.schemas.settlement_activity import SettlementActivityCreate as SettlementActivityCreateSchema
from app.core.exceptions import OverpaymentError


@pytest.fixture
@@ -356,6 +357,73 @@ async def test_create_settlement_activity_overall_status_becomes_partially_paid(
    # Since one split is paid and the other is unpaid, the overall expense status should be partially_paid
    assert test_expense.overall_settlement_status == ExpenseOverallStatusEnum.partially_paid

@pytest.mark.asyncio
async def test_create_settlement_activity_overpayment_protection(
    db_session: AsyncSession, test_user2: User, test_expense_split_user2_owes: ExpenseSplit
):
    """Test that settlement activities prevent overpayment beyond owed amount."""
    # Test split owes 10.00, attempt to pay 15.00 directly
    activity_data = SettlementActivityCreateSchema(
        expense_split_id=test_expense_split_user2_owes.id,
        paid_by_user_id=test_user2.id,
        amount_paid=Decimal("15.00")  # More than the 10.00 owed
    )

    # Should raise OverpaymentError
    with pytest.raises(OverpaymentError):
        await create_settlement_activity(
            db=db_session,
            settlement_activity_in=activity_data,
            current_user_id=test_user2.id
        )

@pytest.mark.asyncio
async def test_create_settlement_activity_overpayment_protection_partial(
    db_session: AsyncSession, test_user2: User, test_expense_split_user2_owes: ExpenseSplit
):
    """Test overpayment protection with multiple payments."""
    # First payment of 7.00
    activity1_data = SettlementActivityCreateSchema(
        expense_split_id=test_expense_split_user2_owes.id,
        paid_by_user_id=test_user2.id,
        amount_paid=Decimal("7.00")
    )

    activity1 = await create_settlement_activity(
        db=db_session,
        settlement_activity_in=activity1_data,
        current_user_id=test_user2.id
    )
    assert activity1 is not None

    # Second payment of 5.00 should be rejected (7 + 5 = 12 > 10 owed)
    activity2_data = SettlementActivityCreateSchema(
        expense_split_id=test_expense_split_user2_owes.id,
        paid_by_user_id=test_user2.id,
        amount_paid=Decimal("5.00")
    )

    with pytest.raises(OverpaymentError):
        await create_settlement_activity(
            db=db_session,
            settlement_activity_in=activity2_data,
            current_user_id=test_user2.id
        )

    # But a payment of 3.00 should work (7 + 3 = 10 = exact amount owed)
    activity3_data = SettlementActivityCreateSchema(
        expense_split_id=test_expense_split_user2_owes.id,
        paid_by_user_id=test_user2.id,
        amount_paid=Decimal("3.00")
    )

    activity3 = await create_settlement_activity(
        db=db_session,
        settlement_activity_in=activity3_data,
        current_user_id=test_user2.id
    )
    assert activity3 is not None

# Example of a placeholder for db_session fixture if not provided by conftest.py:
# @pytest.fixture
# async def db_session() -> AsyncGenerator[AsyncSession, None]:
be/todo.md (new file, 318 lines)
@@ -0,0 +1,318 @@
# Backend Critical Issues & TODOs

This document outlines critical issues found in the backend codebase that require immediate attention. The issues range from severe data-integrity risks to security vulnerabilities and poor API design.

---

### 1. Catastrophic Data Loss via Cascading Deletes

**Severity: CRITICAL**

**Problem:**
The SQLAlchemy models in `be/app/models.py` are configured with `ondelete="CASCADE"` and `cascade="all, delete-orphan"` on relationships to critical shared data such as `Expense`, `List`, `Settlement`, and `Chore`. This creates a chain reaction of data destruction that can be triggered by a single user's action, affecting the data of all other users in a group.

**Specific Scenarios:**

- **Deleting a User:** When a user is deleted, all expenses they paid for, created, or were part of (via `ExpenseSplit`) are deleted. All settlements they participated in are also deleted. This permanently corrupts the financial history of every group the user was a member of, violating data integrity for all other members.
- **Deleting a Group:** Deleting a group permanently deletes **all** associated data, including every expense, shopping list, item, settlement, and the entire chore history.
- **Deleting a List:** Deleting a shopping list deletes all associated expenses.

**Recommendation:**

- Immediately remove all `ondelete="CASCADE"` and `cascade="all, delete-orphan"` configurations from the models where they affect shared data.
- Implement a **soft-delete** or **anonymization** strategy for user deletion to comply with data-protection regulations (such as GDPR's "right to be forgotten") without destroying the integrity of other users' data. When a user is "deleted," their PII should be scrubbed, but their ID and associated records should remain so the financial history stays intact.
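A minimal sketch of the anonymization idea, assuming a `User` model with `email`, `name`, `hashed_password`, and `is_active` fields (hypothetical names; the real columns live in `be/app/models.py`):

```python
from dataclasses import dataclass

@dataclass
class User:
    # Hypothetical subset of the real User model's columns.
    id: int
    email: str
    name: str
    hashed_password: str
    is_active: bool = True

def anonymize_user(user: User) -> User:
    """Scrub PII in place instead of deleting the row, so expense splits
    and settlements that reference user.id remain valid."""
    user.email = f"deleted-user-{user.id}@example.invalid"
    user.name = "Deleted User"
    user.hashed_password = ""   # login becomes impossible
    user.is_active = False      # account retired, row preserved
    return user
```

The row keeps its primary key, so foreign keys from `ExpenseSplit` and `Settlement` stay intact while the "right to be forgotten" is still honored.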
---

### 2. Hidden Group Deletion in "Leave Group" Endpoint

**Severity: CRITICAL**

**Problem:**
There is no explicit `DELETE /groups/{group_id}` endpoint. Instead, the logic for deleting a group is hidden within the `DELETE /groups/{group_id}/leave` endpoint in `be/app/api/v1/endpoints/groups.py`. If the last remaining member of a group, who is also the owner, calls this endpoint, the entire group and all its associated data are silently and permanently deleted.

**Why this is a problem:**

- **Non-Obvious Destruction:** A user would never expect a "leave" action to be destructive for the entire group.
- **No Confirmation:** There is no "are you sure?" check. The action is immediate and irreversible.
- **High Risk of Accidental Deletion:** An owner could easily remove all members and then leave, accidentally wiping all data.

**Recommendation:**

- Create a separate, explicit `DELETE /groups/{group_id}` endpoint.
- This new endpoint must be protected and should only be accessible to the group owner.
- Require multiple confirmation steps before allowing the deletion (e.g., requiring the user to type the group's name to confirm).
- The "leave group" endpoint should **never** result in the deletion of the group. If the last member leaves, the group should be archived or left empty, but not destroyed.

---

### 3. User Deletion Triggers Data Cascade

**Severity: CRITICAL**

**Problem:**
The user management routes in `be/app/api/v1/endpoints/users.py` use the `fastapi-users` library. The configuration in `be/app/auth.py` uses the default `BaseUserManager`, which performs a **hard delete** on the user record in the database.

**Why this is a problem:**
This is the trigger for the catastrophic data loss described in issue #1. A single call to the `DELETE /users/me` endpoint initiates the cascade, destroying shared financial records across multiple groups. This is a direct violation of data integrity for other users.

**Recommendation:**

- Override the default `delete` method of the `UserManager` class in `be/app/auth.py`.
- The new method must implement the anonymization strategy discussed in issue #1: scrub the user's PII and mark the account as inactive, but do **not** delete the row from the database.
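One possible shape for that override. The base class below is only a stand-in for `fastapi_users.BaseUserManager`, and the dict-based user record is illustrative, not the project's model:

```python
import asyncio

class BaseUserManagerStub:
    """Stand-in for fastapi_users.BaseUserManager (real code would subclass it)."""
    async def delete(self, user, request=None):
        raise NotImplementedError("hard delete intentionally unused")

class UserManager(BaseUserManagerStub):
    async def delete(self, user, request=None):
        # Anonymize instead of hard-deleting so shared records survive.
        user["email"] = f"deleted-user-{user['id']}@example.invalid"
        user["name"] = None
        user["is_active"] = False
        # Real code would persist the change through its user database adapter.
        return user

# Demo: "deleting" a user scrubs PII but keeps the record and its id.
demo_user = {"id": 3, "email": "bob@example.com", "name": "Bob", "is_active": True}
result = asyncio.run(UserManager().delete(demo_user))
```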
---

### 4. Inconsistent and Flawed Authorization in Financial Endpoints

**Severity: HIGH**

**Problem:**
Authorization logic is inconsistent across the financial endpoints in `be/app/api/v1/endpoints/financials.py`, creating security gaps and potential for misuse.

**Specific Scenarios:**

- **Creating vs. Updating Expenses:** A regular user cannot create an expense paid by someone else, but they _can_ modify an existing expense as long as they were the original payer. This is inconsistent. A better approach would be to allow only group **owners** to modify financial records created by others.
- **Deleting Settlements:** The `delete_settlement_record` endpoint dangerously allows the user who _created_ the settlement record to delete it. A user could pay a debt, have the recipient confirm it, and then simply delete the record of the payment. Only group **owners** should have the power to delete financial records, and it should be a heavily audited action.

**Recommendation:**

- Establish a clear and consistent authorization policy. A sensible default:
  1. Users can create and modify their own financial entries (e.g., expenses they paid for).
  2. Only group **owners** can modify or delete financial entries created by other users.
- Refactor all financial endpoints (`expenses`, `settlements`) to enforce this policy strictly and consistently.

---

### 5. No Protection Against Over-Settlement of Debts

**Severity: HIGH**

**Problem:**
The `record_settlement_for_expense_split` endpoint in `financials.py` does not validate whether the settlement amount exceeds the amount owed for that expense split.

**Why this is a problem:**
A malicious user could exploit this by "settling" a $5 debt with a $500 payment. The system would record this, creating a false record that the original payer now owes the malicious user $495. This can be used to create phantom debts.

**Recommendation:**

- In `crud_settlement_activity.create_settlement_activity`, add validation to ensure the `amount_paid` in a settlement activity does not exceed the remaining `owed_amount` on the `ExpenseSplit`.
- If an overpayment is attempted, the API should return a `400 Bad Request` error with a clear message.
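The validation itself is only a few lines. A sketch of the check (the function name and signature are illustrative; the codebase does define an `OverpaymentError` in `app.core.exceptions`):

```python
from decimal import Decimal

class OverpaymentError(ValueError):
    """Mirrors app.core.exceptions.OverpaymentError."""

def validate_settlement_amount(owed_amount: Decimal, already_paid: Decimal,
                               new_payment: Decimal) -> Decimal:
    """Return the balance remaining after new_payment, or raise if the
    payment would push the total paid above the amount owed."""
    remaining = owed_amount - already_paid
    if new_payment > remaining:
        raise OverpaymentError(
            f"Payment of {new_payment} exceeds remaining owed amount of {remaining}"
        )
    return remaining - new_payment
```

`create_settlement_activity` would call this with the split's `owed_amount` and the sum of its existing activities before inserting the new row; the API layer then maps `OverpaymentError` to `400 Bad Request`.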
---

### 6. Flawed Business Logic Preventing Group Expenses from Personal Lists

**Severity: MEDIUM**

**Problem:**
The `create_new_expense` endpoint prevents a user from creating a group expense that is associated with one of their personal shopping lists.

**Why this is a problem:**
This does not reflect a real-world use case. A user might use a personal list to shop for a group and then need to file the expense against that group. The current logic forces an artificial separation.

**Recommendation:**

- Remove the validation check that blocks this action. A user should be able to link a group expense to any list they have access to, whether it is a personal or a group list. The critical factor is the `group_id` on the expense itself, not on the list it is optionally linked to.

---

### 7. Unrestricted Access to All Chore History

**Severity: CRITICAL**

**Problem:**
The chore history endpoints in `be/app/api/v1/endpoints/chores.py` lack any authorization checks.

**Specific Endpoints:**

- `GET /{chore_id}/history`
- `GET /assignments/{assignment_id}/history`

Any authenticated user can access the historical data for _any_ chore or assignment in the entire database simply by guessing its ID. This allows users from one group to spy on the activities of other groups, which is a major data leak.

**Recommendation:**

- Add strict authorization checks to these endpoints immediately.
- Before returning any history, verify that the `current_user` is a member of the group to which the chore or assignment belongs.
- For personal chores, verify that the `current_user` is the creator of the chore.
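The guard is cheap to evaluate before any rows are fetched. A sketch (the signature and field names are assumptions, not the project's actual helpers):

```python
def can_view_chore_history(current_user_id: int, chore: dict,
                           group_member_ids: set) -> bool:
    """Return True only when the caller may read this chore's history.

    chore is a mapping with 'type' ('personal' or 'group') and
    'created_by_id'; group_member_ids holds the ids of the chore's
    group members (empty for personal chores).
    """
    if chore["type"] == "personal":
        # Personal chores: only the creator may see the history.
        return chore["created_by_id"] == current_user_id
    # Group chores: the caller must belong to the chore's group.
    return current_user_id in group_member_ids
```

Both history endpoints would raise a 403 when this returns `False`.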
---

### 8. Redundant and Confusing Chore Endpoints

**Severity: LOW**

**Problem:**
The API has a completely separate set of endpoints for "Personal Chores" (e.g., `POST /personal`, `DELETE /personal/{chore_id}`). It also provides generic endpoints (`PUT /{chore_id}`) that can perform the same actions and more, such as converting a personal chore to a group chore.

**Why this is a problem:**

- **API Bloat:** It makes the API larger and more confusing than necessary.
- **Maintenance Burden:** Duplicate logic increases the chance of bugs and makes the codebase harder to maintain. For example, if a business rule for updates changes, it may need to be updated in multiple places.
- **Inconsistency:** The `update_personal_chore` and `update_group_chore` endpoints have slightly different logic and permission checks than the generic `update_chore_any_type` endpoint, which is a recipe for subtle bugs.

**Recommendation:**

- Deprecate and remove the separate `/personal/*` and `/groups/{group_id}/chores/*` endpoints for `POST`, `PUT`, and `DELETE`.
- Consolidate all create, update, and delete logic into a single set of endpoints (e.g., `POST /chores`, `PUT /chores/{chore_id}`, `DELETE /chores/{chore_id}`).
- This single set of endpoints should contain clear, unified logic that handles permissions based on the chore's `type` (personal or group).

---

### 9. Sensitive User Data Leaked in Application Logs

**Severity: MEDIUM**

**Problem:**
Across multiple endpoints (`financials.py`, `groups.py`, `chores.py`, etc.) the code logs user email addresses and sometimes internal object IDs at the **INFO** or **WARNING** level. Example:

```python
logger.info(f"User {current_user.email} creating expense: {expense_in.description}")
```

Logging personally identifiable information (PII) such as email addresses at non-debug levels creates a compliance risk (GDPR) and can leak data if log files are exposed.

**Recommendation:**

- Remove PII (emails, names) from log messages at INFO/WARNING/ERROR levels.
- If user identifiers are required, use the numeric `user.id` or a hashed identifier.
- Move any detailed, user-specific logging to DEBUG level with redaction.

---

### 10. Detailed Internal Errors Returned to Clients

**Severity: MEDIUM**

**Problem:**
Many `except` blocks convert low-level SQLAlchemy exceptions directly into HTTP 400/500 responses with `detail=str(e)`. This leaks internal table names, constraint names, and stack traces to the client.

**Example:** `crud/expense.py` and other CRUD modules.

**Recommendation:**

- Map all internal exceptions to generic error messages for clients (e.g., "Database error, please try again").
- Log the original exception server-side (with redaction) but never expose raw error strings to API consumers.

---

### 11. Insecure Default Configuration Values

**Severity: HIGH**

**Problem:**
`app/config.py` sets insecure defaults that may accidentally reach production:

- `SESSION_SECRET_KEY = "your-session-secret-key"`: a published default secret.
- `ACCESS_TOKEN_EXPIRE_MINUTES = 480` (8 hours): an excessive token lifetime that increases risk if a token is leaked.
- A wide-open `CORS_ORIGINS` allowing any localhost port.

**Recommendation:**

- Make `SESSION_SECRET_KEY` **mandatory** via an environment variable with no default. Fail hard if it is missing.
- Reduce the default `ACCESS_TOKEN_EXPIRE_MINUTES` to 60 or less and allow override via env.
- In production mode (`settings.is_production`), restrict CORS origins to the canonical frontend domain list only.
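A sketch of the fail-hard behavior using plain `os.environ`; the real project reads configuration through a settings object in `app/config.py`, so the mechanics would differ:

```python
import os

def load_session_secret() -> str:
    """Refuse to start with a missing secret or the published default."""
    secret = os.environ.get("SESSION_SECRET_KEY", "")
    if not secret or secret == "your-session-secret-key":
        raise RuntimeError(
            "SESSION_SECRET_KEY must be set to a strong, private value"
        )
    return secret
```

Crashing at import/startup time is deliberate: a misconfigured deployment should fail loudly rather than serve traffic with a known secret.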
---

### 12. Missing Global Rate Limiting / Brute-Force Protection

**Severity: MEDIUM**

**Problem:**
The API exposes authentication (JWT login) and other sensitive endpoints without any rate limiting or brute-force protection, leaving it vulnerable to password-guessing and scraping attacks.

**Recommendation:**

- Integrate a middleware-level rate limiter (e.g., `slowapi`, `fastapi-limiter`) backed by Redis.
- Apply stricter per-IP limits on `/auth/jwt/login`, OTP verification, and password-reset endpoints.
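The sliding-window idea behind those libraries can be sketched in a few lines. This toy in-memory version is for illustration only; production needs Redis-backed state shared across workers, which is what `slowapi` and `fastapi-limiter` provide:

```python
import time
from collections import defaultdict, deque
from typing import Optional

class LoginRateLimiter:
    """Per-key sliding-window limiter (the key would typically be the client IP)."""

    def __init__(self, max_attempts: int, window_seconds: float):
        self.max_attempts = max_attempts
        self.window = window_seconds
        self._hits = defaultdict(deque)

    def allow(self, key: str, now: Optional[float] = None) -> bool:
        now = time.monotonic() if now is None else now
        hits = self._hits[key]
        while hits and now - hits[0] > self.window:
            hits.popleft()              # forget attempts outside the window
        if len(hits) >= self.max_attempts:
            return False                # over the limit: reject the attempt
        hits.append(now)
        return True
```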
---

### 13. Recurring Expense Job Never Commits Transactions

**Severity: HIGH**

**Problem:**
`be/app/jobs/recurring_expenses.py` uses `db.flush()` inside `_generate_next_occurrence` but **never calls `commit()`** on the `AsyncSession`. Because the helper runs inside the scheduler wrapper `run_recurring_expenses_job`, the surrounding context manager closes the session without an explicit commit, so Postgres rolls the transaction back. No newly generated expenses are permanently saved, and the `max_occurrences` counter is not decremented. At the next scheduler tick the exact same expenses are regenerated, flooding the database and the UI with duplicates.

**Recommendation:**

1. Replace the manual `flush()` calls with a **single transactional block** (e.g., `async with db.begin(): ...`) and call `await db.commit()` once per expense or after the batch loop.
2. Add an integration test that verifies a generated expense actually persists after the job finishes.
3. Consider running the scheduler job inside a **database advisory lock** so multiple workers cannot generate duplicates concurrently.
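The failure mode is easy to reproduce outside SQLAlchemy. With stdlib `sqlite3` (the `expenses` table here is illustrative), a row that is inserted but never committed vanishes when the connection closes, just as the flushed expenses do when the session closes:

```python
import os
import sqlite3
import tempfile

db_path = os.path.join(tempfile.mkdtemp(), "demo.db")

# Session 1: insert but never commit -- analogous to flush() without commit().
conn = sqlite3.connect(db_path)
conn.execute("CREATE TABLE expenses (id INTEGER PRIMARY KEY, amount REAL)")
conn.commit()                                   # the schema is committed
conn.execute("INSERT INTO expenses (amount) VALUES (42.0)")
conn.close()                                    # open transaction rolls back

# Session 2: the uncommitted row is gone.
conn = sqlite3.connect(db_path)
row_count = conn.execute("SELECT COUNT(*) FROM expenses").fetchone()[0]
conn.close()
```

With SQLAlchemy the fix is the `async with db.begin():` block from point 1, which commits on successful exit and rolls back on error.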
---

### 14. Blocking, Mixed-Sync Scheduler Configuration Can Starve the Event Loop

**Severity: MEDIUM**

**Problem:**
`be/app/core/scheduler.py` converts the async database URL to a _sync_ URL (`postgresql://`) for `SQLAlchemyJobStore` **and** registers the job with `ThreadPoolExecutor`. This combination forces a blocking, synchronous DB driver (psycopg2) to execute inside FastAPI's event-loop thread pool. Under load, long-running jobs will consume the limited thread pool, delay HTTP requests, and may exhaust the sync connection pool, causing 5xx errors throughout the API layer.

**Recommendation:**

- Use `AsyncIOJobStore` with the existing `postgresql+asyncpg` URL, or move scheduled tasks to a dedicated worker (e.g., Celery, RQ) outside the API process.
- If synchronous execution is unavoidable, enlarge the job executor pool and _run the scheduler in a separate process_ so it cannot impact request latency.

---

### 15. Public Root & Health Endpoints Leak Deployment Metadata

**Severity: MEDIUM**

**Problem:**
Unauthenticated `GET /` and `GET /health` responses include the `environment` ("development", "staging", "production") and the full semantic `version` string straight from `app/config.py`. Attackers can fingerprint deployment targets and time their exploits around release cycles.

**Recommendation:**

- Limit environment and version details to authenticated admin users or internal monitoring networks.
- Replace the public response with a minimal `{ "status": "ok" }` payload.
- Add a separate, authenticated diagnostics endpoint for detailed build info.

---

### 16. Wildcard-Style CORS With `allow_credentials=True` Enables Session Hijacking

**Severity: HIGH**

**Problem:**
`be/app/main.py` registers `CORSMiddleware` with the entire `settings.cors_origins_list` **and** sets `allow_credentials=True`, which instructs browsers to _send cookies and HTTP-only session tokens_ to _any_ origin in that list. Because the default list contains every localhost port and is hard-coded, a malicious site running on another local port can silently read or mutate a user's data via XHR once the victim logs in to the real front end.

**Recommendation:**

1. In production, populate `CORS_ORIGINS` from the environment and restrict it to the canonical front-end domain(s).
2. Set `allow_credentials=False` unless cookie-based auth is absolutely required; JWTs are already used for API auth.
3. Use the `strict-origin-when-cross-origin` referrer policy and enable CSRF protection if cookies stay enabled.
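A sketch of per-environment origin selection (the domain names and dev-server ports are placeholders, not the project's actual values):

```python
def cors_origins_for(environment: str, frontend_domains: list) -> list:
    """Return the CORS allow-list: canonical domains only in production,
    localhost dev servers added in every other environment."""
    if environment == "production":
        return list(frontend_domains)
    return list(frontend_domains) + [
        "http://localhost:5173",   # typical Vite dev server
        "http://localhost:3000",
    ]
```

The returned list would be passed to `CORSMiddleware(allow_origins=...)`, keeping localhost origins out of production builds entirely.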
---

### 17. Groups & Settlements Lack Optimistic-Locking Enforcement

**Severity: MEDIUM**

**Problem:**
The `groups`, `settlements`, and `settlement_activities` tables include a `version` column, but **none of the corresponding endpoints check or increment that version** during `PUT`/`DELETE` operations (unlike `lists` and `items`). Concurrent updates therefore overwrite each other silently, leading to lost data and an inconsistent financial history.

**Recommendation:**

- Mirror the pattern used in the Lists API: require clients to supply the expected `version` and return **409 Conflict** on mismatch.
- Increment `version` in SQL via `UPDATE ... SET version = version + 1` inside the same transaction to guarantee atomicity.
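The compare-and-swap update can be demonstrated with stdlib `sqlite3` (table layout simplified to three columns):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE groups (id INTEGER PRIMARY KEY, name TEXT, version INTEGER)")
conn.execute("INSERT INTO groups VALUES (1, 'Flat 4B', 1)")
conn.commit()

def update_group_name(conn, group_id: int, new_name: str, expected_version: int) -> bool:
    """Atomically bump the version; a False return means the client held a
    stale version and the endpoint should answer 409 Conflict."""
    cur = conn.execute(
        "UPDATE groups SET name = ?, version = version + 1 "
        "WHERE id = ? AND version = ?",
        (new_name, group_id, expected_version),
    )
    conn.commit()
    return cur.rowcount == 1

first = update_group_name(conn, 1, "Flat 4C", expected_version=1)   # version matches
second = update_group_name(conn, 1, "Flat 4D", expected_version=1)  # stale: no rows hit
```

Because the version check and the increment happen in one `UPDATE`, two concurrent writers can never both succeed against the same version.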
---

### 18. Redis Connection Pool Never Closed

**Severity: LOW**

**Problem:**
`be/app/core/redis.py` creates a module-level `redis_pool`, but the pool is **never closed** on application shutdown. In long-running worker processes (e.g., during Kubernetes rolling updates) this leaks file descriptors and keeps TCP connections open until the OS forcibly times them out.

**Recommendation:**

- Add a lifespan handler or shutdown event that calls `await redis_pool.aclose()`.
- Use a factory in the DI system to create and dispose Redis connections per request where feasible.
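The shutdown hook is a one-liner inside a lifespan context manager. The pool object below is a stub standing in for the real Redis pool so the pattern is self-contained:

```python
import asyncio
from contextlib import asynccontextmanager

class RedisPoolStub:
    """Stand-in for the real async Redis connection pool."""
    def __init__(self):
        self.closed = False
    async def aclose(self):
        self.closed = True

redis_pool = RedisPoolStub()

@asynccontextmanager
async def lifespan(app):
    yield                       # the application serves requests here
    await redis_pool.aclose()   # dispose connections on shutdown

async def simulate_app_lifecycle():
    # FastAPI would drive this via FastAPI(lifespan=lifespan).
    async with lifespan(app=None):
        pass

asyncio.run(simulate_app_lifecycle())
```

This is also the lifespan style the project's own guidelines prefer over `@app.on_event("shutdown")`.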
---
docs/chore-system.md (new file, 107 lines)
@@ -0,0 +1,107 @@
# Chore System Documentation

## 1. Overview

The chore system is designed to help users manage tasks, both personal and within groups. It supports recurring chores, assignments, and a detailed history of all changes. This document provides a comprehensive overview of the system's architecture, data models, API endpoints, and frontend implementation.

## 2. Data Models

The chore system is built around three core data models: `Chore`, `ChoreAssignment`, and `ChoreHistory`.

### 2.1. Chore Model

The `Chore` model represents a task to be completed. It can be a personal chore or a group chore.

- **`id`**: The unique identifier for the chore.
- **`name`**: The name of the chore.
- **`description`**: A detailed description of the chore.
- **`type`**: The type of chore, either `personal` or `group`.
- **`group_id`**: The ID of the group the chore belongs to (for group chores).
- **`created_by_id`**: The ID of the user who created the chore.
- **`frequency`**: The frequency of the chore, such as `daily`, `weekly`, or `monthly`.
- **`custom_interval_days`**: The number of days between occurrences for custom-frequency chores.
- **`next_due_date`**: The next due date for the chore.
- **`last_completed_at`**: The timestamp of when the chore was last completed.
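The fields above can be summarized as a dataclass sketch; the Python types here are inferred from the descriptions, not taken from the actual `models.py`:

```python
from dataclasses import dataclass
from datetime import date, datetime
from enum import Enum
from typing import Optional

class ChoreType(str, Enum):
    personal = "personal"
    group = "group"

@dataclass
class Chore:
    id: int
    name: str
    type: ChoreType
    created_by_id: int
    frequency: str                              # e.g. "daily", "weekly", "monthly"
    next_due_date: date
    description: Optional[str] = None
    group_id: Optional[int] = None              # set only for group chores
    custom_interval_days: Optional[int] = None  # used with a custom frequency
    last_completed_at: Optional[datetime] = None
```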
### 2.2. ChoreAssignment Model

The `ChoreAssignment` model represents the assignment of a chore to a user.

- **`id`**: The unique identifier for the assignment.
- **`chore_id`**: The ID of the chore being assigned.
- **`assigned_to_user_id`**: The ID of the user the chore is assigned to.
- **`due_date`**: The due date for the assignment.
- **`is_complete`**: A boolean indicating whether the assignment is complete.
- **`completed_at`**: The timestamp of when the assignment was completed.

### 2.3. ChoreHistory Model

The `ChoreHistory` model tracks all changes to a chore, such as creation, updates, and completion.

- **`id`**: The unique identifier for the history entry.
- **`chore_id`**: The ID of the chore the history entry belongs to.
- **`event_type`**: The type of event, such as `created`, `updated`, or `completed`.
- **`event_data`**: A JSON object containing details about the event.
- **`changed_by_user_id`**: The ID of the user who made the change.

## 3. API Endpoints

The chore system exposes a set of API endpoints for managing chores, assignments, and history.

### 3.1. Chores

- **`GET /api/v1/chores/all`**: Retrieves all chores for the current user.
- **`POST /api/v1/chores/personal`**: Creates a new personal chore.
- **`GET /api/v1/chores/personal`**: Retrieves all personal chores for the current user.
- **`PUT /api/v1/chores/personal/{chore_id}`**: Updates a personal chore.
- **`DELETE /api/v1/chores/personal/{chore_id}`**: Deletes a personal chore.
- **`POST /api/v1/chores/groups/{group_id}/chores`**: Creates a new group chore.
- **`GET /api/v1/chores/groups/{group_id}/chores`**: Retrieves all chores for a specific group.
- **`PUT /api/v1/chores/groups/{group_id}/chores/{chore_id}`**: Updates a group chore.
- **`DELETE /api/v1/chores/groups/{group_id}/chores/{chore_id}`**: Deletes a group chore.

### 3.2. Assignments

- **`POST /api/v1/chores/assignments`**: Creates a new chore assignment.
- **`GET /api/v1/chores/assignments/my`**: Retrieves all chore assignments for the current user.
- **`GET /api/v1/chores/chores/{chore_id}/assignments`**: Retrieves all assignments for a specific chore.
- **`PUT /api/v1/chores/assignments/{assignment_id}`**: Updates a chore assignment.
- **`DELETE /api/v1/chores/assignments/{assignment_id}`**: Deletes a chore assignment.
- **`PATCH /api/v1/chores/assignments/{assignment_id}/complete`**: Marks a chore assignment as complete.

### 3.3. History

- **`GET /api/v1/chores/{chore_id}/history`**: Retrieves the history for a specific chore.
- **`GET /api/v1/chores/assignments/{assignment_id}/history`**: Retrieves the history for a specific chore assignment.

## 4. Frontend Implementation

The frontend for the chore system is built using Vue.js and the Composition API. The main component is `ChoresPage.vue`, which is responsible for displaying the list of chores, handling user interactions, and communicating with the backend.
|
||||
|
||||
### 4.1. State Management
|
||||
|
||||
The `ChoresPage.vue` component uses Pinia for state management. The `useChoreStore` manages the state of the chores, including the list of chores, the current chore being edited, and the loading state.
|
||||
|
||||
### 4.2. User Experience
|
||||
|
||||
The user experience is designed to be intuitive and efficient. Users can easily view their chores, mark them as complete, and create new chores. The UI also provides feedback to the user, such as loading indicators and success messages.
|
||||
|
||||
## 5. Reliability and UX Recommendations
|
||||
|
||||
To improve the reliability and user experience of the chore system, the following recommendations are suggested:
|
||||
|
||||
### 5.1. Optimistic UI Updates
|
||||
|
||||
When a user marks a chore as complete, the UI should immediately reflect the change, even before the API call has completed. This will make the UI feel more responsive and reduce perceived latency.
|
||||
|
||||
### 5.2. More Robust Error Handling
|
||||
|
||||
The frontend should provide more specific and helpful error messages to the user. For example, if an API call fails, the UI should display a message that explains what went wrong and what the user can do to fix it.
|
||||
|
||||
### 5.3. Intuitive User Interface
|
||||
|
||||
The user interface could be improved to make it more intuitive and easier to use. For example, the chore creation form could be simplified, and the chore list could be made more scannable.
|
||||
|
||||
### 5.4. Improved Caching
|
||||
|
||||
The frontend should implement a more sophisticated caching strategy to reduce the number of API calls and improve performance. For example, the list of chores could be cached in local storage, and the cache could be invalidated when a new chore is created or an existing chore is updated.
|
@@ -1,368 +0,0 @@
# Expense System Documentation

## Overview

The expense system is a core feature that allows users to track shared expenses, split them among group members, and manage settlements. The system supports various split types and integrates with lists, groups, and items.

## Core Components

### 1. Expenses

An expense represents a shared cost that needs to be split among multiple users.

#### Key Properties

- `id`: Unique identifier
- `description`: Description of the expense
- `total_amount`: Total cost of the expense (Decimal)
- `currency`: Currency code (defaults to "USD")
- `expense_date`: When the expense occurred
- `split_type`: How the expense should be divided
- `list_id`: Optional reference to a shopping list
- `group_id`: Optional reference to a group
- `item_id`: Optional reference to a specific item
- `paid_by_user_id`: User who paid for the expense
- `created_by_user_id`: User who created the expense record
- `version`: For optimistic locking
- `overall_settlement_status`: Overall payment status

#### Status Types

```typescript
enum ExpenseOverallStatusEnum {
  UNPAID = "unpaid",
  PARTIALLY_PAID = "partially_paid",
  PAID = "paid",
}
```

### 2. Expense Splits

Splits represent how an expense is divided among users.

#### Key Properties

- `id`: Unique identifier
- `expense_id`: Reference to the parent expense
- `user_id`: User who owes this portion
- `owed_amount`: Amount owed by the user
- `share_percentage`: Percentage share (for percentage-based splits)
- `share_units`: Number of shares (for share-based splits)
- `status`: Current payment status
- `paid_at`: When the split was paid
- `settlement_activities`: List of payment activities

#### Status Types

```typescript
enum ExpenseSplitStatusEnum {
  UNPAID = "unpaid",
  PARTIALLY_PAID = "partially_paid",
  PAID = "paid",
}
```

### 3. Settlement Activities

Settlement activities track individual payments made towards expense splits.

#### Key Properties

- `id`: Unique identifier
- `expense_split_id`: Reference to the split being paid
- `paid_by_user_id`: User making the payment
- `amount_paid`: Amount being paid
- `paid_at`: When the payment was made
- `created_by_user_id`: User who recorded the payment

## Split Types

The system supports multiple ways to split expenses:

### 1. Equal Split

- Divides the total amount equally among all participants
- Handles rounding differences by adding the remainder to the first split
- No additional data required

### 2. Exact Amounts

- Users specify exact amounts for each person
- The sum of the amounts must equal the expense total
- Requires `splits_in` data with exact amounts

### 3. Percentage Based

- Users specify percentage shares
- Percentages must sum to 100%
- Requires `splits_in` data with percentages

### 4. Share Based

- Users specify a number of shares
- The amount is divided proportionally to the shares
- Requires `splits_in` data with share units

### 5. Item Based

- Splits based on items in a shopping list
- Each item's cost is assigned to the user who added it
- Requires `list_id` and optionally `item_id`
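The equal-split rounding rule described above can be sketched as follows (a minimal illustration with `Decimal`, not the actual backend code):

```python
from decimal import Decimal, ROUND_HALF_UP

def equal_split(total: Decimal, user_ids: list[int]) -> dict[int, Decimal]:
    """Divide `total` equally, assigning any rounding remainder to the first split."""
    n = len(user_ids)
    base = (total / n).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
    amounts = {uid: base for uid in user_ids}
    # The rounded shares may not sum exactly to the total; park the
    # difference on the first participant so the books balance.
    remainder = total - base * n
    amounts[user_ids[0]] += remainder
    return amounts
```

For example, splitting 100.00 three ways yields 33.34 / 33.33 / 33.33, which sums back to exactly 100.00.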

## Integration Points

### 1. Lists

- Expenses can be associated with shopping lists
- Item-based splits use list items to determine splits
- A list's group context can determine split participants

### 2. Groups

- Expenses can be directly associated with groups
- Group membership determines who can be included in splits
- A group context is required if no list is specified

### 3. Items

- Expenses can be linked to specific items
- Item prices are used for item-based splits
- Items must belong to a list

### 4. Users

- Users can be payers, debtors, or payment recorders
- User relationships are tracked in splits and settlements
- A user context is required for all financial operations

## Key Operations

### 1. Creating Expenses

1. Validate the context (list/group)
2. Create the expense record
3. Generate splits based on the split type
4. Validate that the split totals match the expense amount
5. Save all records in a single transaction

### 2. Updating Expenses

- Limited to non-financial fields:
  - Description
  - Currency
  - Expense date
- Uses optimistic locking via the `version` field
- Splits cannot be modified after creation

### 3. Recording Payments

1. Create a settlement activity
2. Update the split status
3. Recalculate the expense's overall status
4. All operations run in a single transaction
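The status recalculation in steps 2–3 can be sketched like this (a simplified model of the rollup logic, not the production code):

```python
from decimal import Decimal

def split_status(owed: Decimal, paid: Decimal) -> str:
    """Derive a split's status from the amount paid so far."""
    if paid <= 0:
        return "unpaid"
    return "paid" if paid >= owed else "partially_paid"

def overall_status(split_statuses: list[str]) -> str:
    """Roll the split statuses up into the expense's overall status."""
    if all(s == "paid" for s in split_statuses):
        return "paid"
    if all(s == "unpaid" for s in split_statuses):
        return "unpaid"
    return "partially_paid"
```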

### 4. Deleting Expenses

- Requires a matching version
- Cascades to splits and settlements
- All operations run in a single transaction

## Best Practices

1. **Data Integrity**

   - Always use transactions for multi-step operations
   - Validate that totals match before saving
   - Use optimistic locking for updates

2. **Error Handling**

   - Handle database errors appropriately
   - Validate user permissions
   - Check for concurrent modifications

3. **Performance**

   - Use appropriate indexes
   - Load relationships efficiently
   - Batch operations when possible

4. **Security**

   - Validate user permissions
   - Sanitize input data
   - Use proper access controls

## Common Use Cases

1. **Group Dinner**

   - Create an expense with the total amount
   - Use an equal split or exact amounts
   - Record payments as they occur

2. **Shopping List**

   - Create an item-based expense
   - The system automatically splits based on items
   - Track payments per person

3. **Rent Sharing**

   - Create an expense with the total rent
   - Use a percentage or share-based split
   - Record monthly payments

4. **Trip Expenses**

   - Create multiple expenses
   - Mix different split types
   - Track overall balances

## Recurring Expenses

Recurring expenses repeat at regular intervals. They are useful for regular payments like rent, utilities, or subscription services.

### Recurrence Types

1. **Daily**

   - Repeats every X days
   - Example: daily parking fee

2. **Weekly**

   - Repeats every X weeks on specific days
   - Example: weekly cleaning service

3. **Monthly**

   - Repeats every X months on the same date
   - Example: monthly rent payment

4. **Yearly**

   - Repeats every X years on the same date
   - Example: annual insurance premium

### Implementation Details

1. **Recurrence Pattern**

   ```typescript
   interface RecurrencePattern {
     type: "daily" | "weekly" | "monthly" | "yearly";
     interval: number; // Every X days/weeks/months/years
     daysOfWeek?: number[]; // For weekly recurrence (0-6, Sunday-Saturday)
     endDate?: string; // Optional end date for the recurrence
     maxOccurrences?: number; // Optional maximum number of occurrences
   }
   ```
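A minimal sketch of how the next occurrence could be derived from such a pattern (the field names follow the interface above; the helper is illustrative, not the scheduler's actual code):

```python
from datetime import datetime, timedelta

def next_occurrence(last: datetime, rec_type: str, interval: int) -> datetime:
    """Compute the next run time from the previous one, per recurrence type."""
    if rec_type == "daily":
        return last + timedelta(days=interval)
    if rec_type == "weekly":
        return last + timedelta(weeks=interval)
    if rec_type == "monthly":
        # Roll the month forward, clamping the day for safety; real code
        # would handle 29-31 with calendar-aware arithmetic.
        month = last.month - 1 + interval
        year = last.year + month // 12
        month = month % 12 + 1
        return last.replace(year=year, month=month, day=min(last.day, 28))
    if rec_type == "yearly":
        return last.replace(year=last.year + interval)
    raise ValueError(f"unknown recurrence type: {rec_type}")
```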

2. **Recurring Expense Properties**

   - All standard expense properties
   - `recurrence_pattern`: Defines how the expense repeats
   - `next_occurrence`: When the next expense will be created
   - `last_occurrence`: When the last expense was created
   - `is_recurring`: Boolean flag identifying recurring expenses

3. **Generation Process**

   - The system automatically creates new expenses based on the pattern
   - Each generated expense is a regular expense with its own splits
   - The original recurring expense serves as a template
   - Generated expenses can be modified individually

4. **Management Features**

   - Pause/resume recurrence
   - Modify future occurrences
   - Skip specific occurrences
   - End recurrence early
   - View all generated expenses

### Best Practices for Recurring Expenses

1. **Data Management**

   - Keep the original recurring expense as a template
   - Generate new expenses in advance
   - Clean up old generated expenses periodically

2. **User Experience**

   - Clearly indicate recurring expenses
   - Provide an easy way to modify future occurrences
   - Offer an option to handle exceptions

3. **Performance**

   - Batch-process expense generation
   - Index recurring expense queries
   - Cache frequently accessed patterns

### Example Use Cases

1. **Monthly Rent**

   ```json
   {
     "description": "Monthly Rent",
     "total_amount": "2000.00",
     "split_type": "PERCENTAGE",
     "recurrence_pattern": {
       "type": "monthly",
       "interval": 1,
       "endDate": "2024-12-31"
     }
   }
   ```

2. **Weekly Cleaning Service**

   ```json
   {
     "description": "Weekly Cleaning",
     "total_amount": "150.00",
     "split_type": "EQUAL",
     "recurrence_pattern": {
       "type": "weekly",
       "interval": 1,
       "daysOfWeek": [1]
     }
   }
   ```

   (`"daysOfWeek": [1]` means every Monday; JSON does not allow inline comments.)

## API Considerations

1. **Decimal Handling**

   - Use string representation for decimals in the API
   - Convert to `Decimal` for calculations
   - Round monetary amounts to 2 decimal places
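A small illustration of the strings-on-the-wire, `Decimal`-internally convention (standard `decimal` usage, not project code; the helper names are hypothetical):

```python
from decimal import Decimal, ROUND_HALF_UP

CENTS = Decimal("0.01")

def parse_amount(raw: str) -> Decimal:
    """Parse an API amount string into a Decimal rounded to cents."""
    return Decimal(raw).quantize(CENTS, rounding=ROUND_HALF_UP)

def serialize_amount(amount: Decimal) -> str:
    """Render a Decimal back into the string form used on the wire."""
    return str(amount.quantize(CENTS, rounding=ROUND_HALF_UP))
```

Strings avoid the binary-float precision loss that would occur if amounts were sent as JSON numbers.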

2. **Date Handling**

   - Use ISO format for dates
   - Store in UTC
   - Convert to local time for display

3. **Status Updates**

   - Update the split status on payment
   - Recalculate the overall status
   - Notify relevant users

## Future Considerations

1. **Potential Enhancements**

   - Recurring expenses
   - Bulk operations
   - Advanced reporting
   - Currency conversion

2. **Scalability**

   - Handle large groups
   - Optimize for frequent updates
   - Consider caching strategies

3. **Integration**

   - Payment providers
   - Accounting systems
   - Export capabilities

docs/financial-system-overview.md (new file, 270 lines)
@@ -0,0 +1,270 @@
# Financial System Overview

## Introduction

This document provides a comprehensive overview of the **Expense, Cost & Financial** domain of the project. It is intended for backend/frontend developers, QA engineers, and DevOps personnel who need to understand **how money flows through the system**, what invariants are enforced, and where to extend the platform.

> **TL;DR** The financial subsystem is a *Splitwise*-inspired engine with first-class support for shared lists, item-derived expenses, multi-scheme splitting, settlements, recurring charges, and complete auditability.

---

## Main Concepts & Entities

| Entity | Purpose | Key Relationships |
|--------|---------|-------------------|
| **User** | A registered account. Owns expenses, owes splits, records settlements. | `expenses_paid`, `expenses_created`, `expense_splits`, `settlements_made/received`, `settlements_created` |
| **Group** | A collection of users with shared expenses / lists. | `member_associations (UserGroup)`, `lists`, `expenses`, `settlements` |
| **List** | A shopping / to-do list. May belong to a `Group` or be personal. | `items`, `expenses` |
| **Item** | A purchasable line inside a `List`. Price & author drive *item-based* expense splits. | `list`, `added_by_user` |
| **Expense** | A monetary outflow. | `splits`, `list`, `group`, `item`, `recurrence_pattern` |
| **ExpenseSplit** | A *who-owes-what* record for an `Expense`. | `user`, `settlement_activities` |
| **Settlement** | A generic *cash transfer* between two users inside a group. | – |
| **SettlementActivity** | A *payment* that reduces an individual `ExpenseSplit` (e.g. Alice pays Bob her part). | – |
| **RecurrencePattern** | The schedule template that spawns future `Expense` occurrences. | `expenses` |
| **FinancialAuditLog** | Append-only journal of *who did what* (create / update / delete) for all financial entities. | – |

---

## Expense Lifecycle

1. **Creation** (`POST /financials/expenses`)
   - The caller provides an `ExpenseCreate` DTO.
   - The backend validates the *context* (list / group / item), the *payer*, and the chosen **split strategy**.
   - Supported `split_type` values (`SplitTypeEnum`):
     * `EQUAL` – evenly divided among the computed participants.
     * `EXACT_AMOUNTS` – the caller supplies absolute owed amounts.
     * `PERCENTAGE` – the caller supplies percentages totalling 100%.
     * `SHARES` – integer share units (e.g. 1 : 2 : 3).
     * `ITEM_BASED` – derived from priced `Item`s in a list.
   - A database transaction writes the `Expense` + `ExpenseSplit` rows and a `FinancialAuditLog` entry.

2. **Reading**
   - `GET /financials/expenses/{id}` enforces *row-level* security: the requester must be the payer, a list member, or a group member.

3. **Update / Delete**
   - Optimistic locking via the `version` field.
   - Only the **payer** or a **group owner** may mutate records.

4. **Settlement**
   - Generic settlements (`/financials/settlements`) clear balances between two users.
   - Fine-grained settlements (`/financials/expense_splits/{id}/settle`) clear a single `ExpenseSplit`.

5. **Recurring Expenses**
   - An `Expense` can be flagged `is_recurring = true` and carry a `RecurrencePattern`.
   - A background job (`app/jobs/recurring_expenses.py::generate_recurring_expenses`) wakes up daily and:
     1. Finds template expenses that are due (`next_occurrence <= now`).
     2. Spawns a *child* expense replicating the split logic.
     3. Updates `last_occurrence`, decrements `max_occurrences`, and calculates the next date.

---

## Cost Summaries & Balance Sheets

### List Cost Summary

Endpoint: `GET /costs/lists/{list_id}/cost-summary`

- If an `ITEM_BASED` expense already exists → returns a snapshot derived from the **expense**.
- Otherwise → computes an *on-the-fly* summary from `Item.price` values (read-only).

Key figures:

* `total_list_cost` – sum of item prices.
* `equal_share_per_user` – what each participant *should* pay.
* `balance` – the (over / under) contribution of each user.

*Action:* `POST /costs/lists/{id}/cost-summary` finalises the list by persisting the derived `ITEM_BASED` expense.

### Group Balance Summary

Endpoint: `GET /costs/groups/{group_id}/balance-summary`

Aggregates across **all** group expenses + settlements:

* What each user paid (expenses + received settlements)
* What each user owed
* A suggested minimal settlement graph (creditors vs debtors)
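The settlement suggestion can be sketched as a greedy pairing of creditors and debtors (an illustration of the idea; the service's actual algorithm may differ):

```python
from decimal import Decimal

def suggest_settlements(net: dict[str, Decimal]) -> list[tuple[str, str, Decimal]]:
    """Given net balances (positive = is owed, negative = owes; must sum to
    zero), return (debtor, creditor, amount) transfers that settle everyone."""
    creditors = sorted(((u, b) for u, b in net.items() if b > 0), key=lambda x: -x[1])
    debtors = sorted(((u, -b) for u, b in net.items() if b < 0), key=lambda x: -x[1])
    transfers, i, j = [], 0, 0
    while i < len(debtors) and j < len(creditors):
        d_user, d_amt = debtors[i]
        c_user, c_amt = creditors[j]
        pay = min(d_amt, c_amt)  # largest debtor pays largest creditor
        transfers.append((d_user, c_user, pay))
        debtors[i] = (d_user, d_amt - pay)
        creditors[j] = (c_user, c_amt - pay)
        if debtors[i][1] == 0:
            i += 1
        if creditors[j][1] == 0:
            j += 1
    return transfers
```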

---

## Data-Integrity Rules & Guards

1. An `Expense` **must** reference *either* `list_id` **or** `group_id` (DB-level CHECK).
2. Row-uniqueness guard: `UniqueConstraint('expense_id', 'user_id')` on `ExpenseSplit`.
3. `Settlement` payer ≠ payee (DB CHECK).
4. All mutating endpoints perform **authorization** checks via the `crud_group` / `crud_list` helpers.
5. Monetary amounts are stored as `Numeric(10,2)` and rounded (`ROUND_HALF_UP`).

---

## Recent Fixes & Improvements (June 2025)

| Area | Issue | Resolution |
|------|-------|------------|
| **Recurring filter** | `GET /financials/expenses?isRecurring=true` referenced the nonexistent `recurrence_rule`. | Switched to the `is_recurring` flag. |
| **Enum mismatches** | `RecurrencePattern.type` stored an uppercase enum, while the API took lowercase strings. | A robust mapper converts strings → `RecurrenceTypeEnum`; the scheduler is now case-insensitive. |
| **Scheduler** | `_calculate_next_occurrence` failed with Enum values & stringified `days_of_week`. | Added polymorphic handling + safe parsing of comma-separated strings. |

*All tests pass (`pytest -q`), and new unit tests cover the edge cases above.*

---

## Extension Points

* **VAT / Tax logic** – attach a `tax_rate` column to `Expense` and resolve net vs gross amounts.
* **Multi-currency** – normalize amounts to a base currency using FX rates; expose a `Currency` table.
* **Budgeting / Limits** – per-group or per-list spending caps with alerting.
* **Webhook notifications** – publish `FinancialAuditLog` entries to external services.

---

## Appendix – Key SQL Schema Snapshots

```sql
-- Expense table (excerpt)
CREATE TABLE expenses (
    id SERIAL PRIMARY KEY,
    total_amount NUMERIC(10,2) NOT NULL,
    split_type VARCHAR(20) NOT NULL,
    list_id INT NULL,
    group_id INT NULL,
    -- …
    CHECK (group_id IS NOT NULL OR list_id IS NOT NULL)
);

CREATE UNIQUE INDEX uq_expense_user_split ON expense_splits(expense_id, user_id);
```

---

## System Reliability Analysis & Improvements

### ✅ Implemented Reliability Features

#### 1. **Transaction Safety**

- All financial operations use **transactional sessions** (`get_transactional_session`)
- Atomic operations ensure data consistency across expense creation, splits, and settlements
- Row-level locking (`SELECT ... FOR UPDATE`) prevents race conditions in settlement activities

#### 2. **Overpayment Protection**

- **NEW**: Settlement activities now validate against the remaining owed amount
- Prevents payments that exceed a split's owed amount
- Provides clear error messages with the remaining balance
- Handles multiple partial payments correctly
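A sketch of the overpayment check (illustrative; the real service raises its own `OverpaymentError` with more context):

```python
from decimal import Decimal

class OverpaymentError(ValueError):
    """Raised when a payment exceeds the split's remaining balance."""

def validate_payment(owed: Decimal, already_paid: Decimal, new_payment: Decimal) -> None:
    """Reject payments that are non-positive or exceed the remaining balance."""
    if new_payment <= 0:
        raise ValueError("payment must be positive")
    remaining = owed - already_paid
    if new_payment > remaining:
        raise OverpaymentError(
            f"payment {new_payment} exceeds remaining balance {remaining}"
        )
```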

#### 3. **Data Validation & Constraints**

- **Decimal precision**: All monetary amounts use `Numeric(10,2)` with proper rounding
- **Positive-amount validation**: Prevents negative payments and settlements
- **User-existence validation**: Ensures all referenced users exist before operations
- **Split consistency**: Validates that split totals match the expense amount (EXACT_AMOUNTS, PERCENTAGE)

#### 4. **Permission & Authorization**

- Multi-layered permission checks for expense creation and settlement recording
- Group owners can act on behalf of members, with proper validation
- List/group access controls prevent unauthorized financial operations

#### 5. **Status Management**

- Automatic status updates for expense splits (unpaid → partially_paid → paid)
- Cascading status updates for parent expenses based on split states
- Pessimistic locking ensures consistent status transitions

#### 6. **Audit Trail**

- All financial operations are logged via `create_financial_audit_log`
- Complete traceability of who created/modified financial records
- Immutable settlement-activity records (no updates, only creation)

#### 7. **Error Handling**

- Comprehensive exception hierarchy for different error types
- A specific `OverpaymentError` for payment-validation failures
- Database integrity and connection error handling
- Graceful degradation with meaningful error messages

### 🔍 Potential Areas for Enhancement

#### 1. **Optimistic Locking for Expenses**

Expenses currently use basic versioning but could benefit from full optimistic locking:

```python
# Consider adding to ExpenseUpdate operations
if expected_version != current_expense.version:
    raise ConflictError("Expense was modified by another user")
```

#### 2. **Recurring Expense Reliability**

- Add retry logic for failed recurring-expense generation
- Implement a dead-letter queue for failed recurring operations
- Add monitoring for recurring-expense job health

#### 3. **Currency Consistency**

Although a currency is stored on each expense, nothing validates that all splits in an expense use the same currency:

```python
# Potential enhancement
if any(split.currency != expense.currency for split in splits):
    raise InvalidOperationError("All splits must use the same currency as the parent expense")
```

#### 4. **Settlement Verification**

Consider adding verification flags for settlements to distinguish between:

- Automatic settlements (from expense splits)
- Manual settlements (direct user payments)
- Disputed settlements requiring verification

### 📊 System Health Metrics

The system provides comprehensive financial tracking through:

1. **Real-time balance calculations** via `costs_service.py`
2. **Settlement suggestions** using optimal debt-reduction algorithms
3. **Expense categorization and filtering** with proper indexing
4. **Multi-context support** (lists, groups, personal expenses)

### 🛡️ Security Considerations

- **Input sanitization**: All financial inputs are validated through Pydantic schemas
- **Authorization layers**: Multiple permission checks prevent unauthorized access
- **Audit logging**: Complete financial-operation history for compliance
- **Data isolation**: Users only see expenses/settlements they have permission to access

### 🚀 Performance Optimizations

- **Database indexing** on critical foreign keys and search fields
- **Eager loading** with `selectinload` to prevent N+1 queries
- **Pagination** for large result sets
- **Connection pooling** with health checks (`pool_pre_ping=True`)

---

## Testing & Quality Assurance

The financial system includes comprehensive test coverage:

1. **Unit tests** for CRUD operations and business logic
2. **Integration tests** for API endpoints and workflows
3. **Edge-case tests** for overpayment protection and boundary conditions
4. **Concurrency tests** for settlement race conditions
5. **Data-consistency tests** for split calculations and status updates

Example test scenarios:

- Multiple users settling the same split simultaneously
- Validation of expense-split totals across different split types
- Currency precision and rounding accuracy
- Permission-boundary testing for cross-user operations

---

## Deployment & Monitoring Recommendations

### Database Considerations

- Regular backup strategy for financial data
- Monitor transaction isolation levels and deadlocks
- Set up alerts for unusual financial-activity patterns
- Implement database-connection monitoring

### Application Monitoring

- Track settlement-activity creation rates and failures
- Monitor recurring-expense job execution and errors
- Set up alerts for permission-denial patterns
- Track API response times for financial endpoints

### Business Intelligence

- Daily/weekly financial summaries per group
- Settlement-velocity tracking (time to pay debts)
- Expense-categorization analytics
- User engagement with financial features

The financial system is now **production-ready**, with robust reliability safeguards, comprehensive error handling, and strong data-consistency guarantees.
@@ -16,6 +16,10 @@ SESSION_SECRET_KEY=your_session_secret_key_here_minimum_32_characters_long
GEMINI_API_KEY=your_gemini_api_key_here

# Redis Configuration
# If you are running the Redis container from docker-compose, the connection URL is usually:
#   redis://:<password>@redis:6379/0
# Otherwise adjust host/port/password as required.
REDIS_URL=redis://:your_redis_password_here@redis:6379/0
REDIS_PASSWORD=your_redis_password_here

# Sentry Configuration (Optional but recommended)
@@ -43,4 +47,13 @@ APPLE_REDIRECT_URI=https://yourdomain.com/auth/apple/callback

# Production Settings
ENVIRONMENT=production

# Logging Configuration
# Valid LOG_LEVEL values: DEBUG, INFO, WARNING, ERROR, CRITICAL
LOG_LEVEL=INFO
# LOG_FORMAT defaults to a timestamped pattern – override only if you have special needs.
# LOG_FORMAT="%(asctime)s - %(name)s - %(levelname)s - %(message)s"

# Auth / Security
# By default JWT access tokens live for 60 minutes; you can shorten or extend here (in minutes).
ACCESS_TOKEN_EXPIRE_MINUTES=60
29
fe/package-lock.json
generated
29
fe/package-lock.json
generated
@ -15,6 +15,7 @@
|
||||
"@vueuse/core": "^13.1.0",
|
||||
"axios": "^1.9.0",
|
||||
"date-fns": "^4.1.0",
|
||||
"framer-motion": "^12.16.0",
|
||||
"motion": "^12.15.0",
|
||||
"pinia": "^3.0.2",
|
||||
"qs": "^6.14.0",
|
||||
@ -34,6 +35,7 @@
|
||||
"@types/date-fns": "^2.5.3",
|
||||
"@types/jsdom": "^21.1.7",
|
||||
"@types/node": "^22.15.17",
|
||||
"@types/qs": "^6.14.0",
|
||||
"@vitejs/plugin-vue": "^5.2.3",
|
||||
"@vitest/eslint-plugin": "^1.1.39",
|
||||
"@vue/eslint-config-prettier": "^10.2.0",
|
||||
@ -4291,6 +4293,13 @@
|
||||
"integrity": "sha512-PIzZZlEppgrpoT2QgbnDU+MMzuR6BbCjllj0bM70lWoejMeNJAxCchxnv7J3XFkI8MpygtRpzXrIlmWUBclP5A==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/@types/qs": {
|
||||
"version": "6.14.0",
|
||||
"resolved": "https://registry.npmjs.org/@types/qs/-/qs-6.14.0.tgz",
|
||||
"integrity": "sha512-eOunJqu0K1923aExK6y8p6fsihYEn/BYuQ4g0CxAAgFc4b/ZLN4CrsRZ55srTdqoiLzU2B2evC+apEIxprEzkQ==",
|
||||
"dev": true,
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/@types/react": {
|
||||
"version": "19.1.6",
|
||||
"resolved": "https://registry.npmjs.org/@types/react/-/react-19.1.6.tgz",
|
||||
@ -7491,12 +7500,12 @@
|
||||
}
|
||||
},
|
||||
"node_modules/framer-motion": {
|
||||
"version": "12.15.0",
|
||||
"resolved": "https://registry.npmjs.org/framer-motion/-/framer-motion-12.15.0.tgz",
|
||||
"integrity": "sha512-XKg/LnKExdLGugZrDILV7jZjI599785lDIJZLxMiiIFidCsy0a4R2ZEf+Izm67zyOuJgQYTHOmodi7igQsw3vg==",
|
||||
"version": "12.16.0",
|
||||
"resolved": "https://registry.npmjs.org/framer-motion/-/framer-motion-12.16.0.tgz",
|
||||
"integrity": "sha512-xryrmD4jSBQrS2IkMdcTmiS4aSKckbS7kLDCuhUn9110SQKG1w3zlq1RTqCblewg+ZYe+m3sdtzQA6cRwo5g8Q==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"motion-dom": "^12.15.0",
|
||||
"motion-dom": "^12.16.0",
|
||||
"motion-utils": "^12.12.1",
|
||||
"tslib": "^2.4.0"
|
||||
},
|
||||
@@ -9295,9 +9304,9 @@
             }
         },
         "node_modules/motion-dom": {
-            "version": "12.15.0",
-            "resolved": "https://registry.npmjs.org/motion-dom/-/motion-dom-12.15.0.tgz",
-            "integrity": "sha512-D2ldJgor+2vdcrDtKJw48k3OddXiZN1dDLLWrS8kiHzQdYVruh0IoTwbJBslrnTXIPgFED7PBN2Zbwl7rNqnhA==",
+            "version": "12.16.0",
+            "resolved": "https://registry.npmjs.org/motion-dom/-/motion-dom-12.16.0.tgz",
+            "integrity": "sha512-Z2nGwWrrdH4egLEtgYMCEN4V2qQt1qxlKy/uV7w691ztyA41Q5Rbn0KNGbsNVDZr9E8PD2IOQ3hSccRnB6xWzw==",
             "license": "MIT",
             "dependencies": {
                 "motion-utils": "^12.12.1"
@@ -11732,9 +11741,9 @@
             "license": "MIT"
         },
         "node_modules/tinyglobby": {
-            "version": "0.2.13",
-            "resolved": "https://registry.npmjs.org/tinyglobby/-/tinyglobby-0.2.13.tgz",
-            "integrity": "sha512-mEwzpUgrLySlveBwEVDMKk5B57bhLPYovRfPAXD5gA/98Opn0rCDj3GtLwFvCvH5RK9uPCExUROW5NjDwvqkxw==",
+            "version": "0.2.14",
+            "resolved": "https://registry.npmjs.org/tinyglobby/-/tinyglobby-0.2.14.tgz",
+            "integrity": "sha512-tX5e7OM1HnYr2+a2C/4V0htOcSQcoSTH9KgJnVvNm5zm/cyEWKJ7j7YutsH9CxMdtOkkLFy2AHrMci9IM8IPZQ==",
             "dev": true,
             "license": "MIT",
             "dependencies": {
@@ -26,6 +26,7 @@
         "@vueuse/core": "^13.1.0",
         "axios": "^1.9.0",
         "date-fns": "^4.1.0",
+        "framer-motion": "^12.16.0",
         "motion": "^12.15.0",
         "pinia": "^3.0.2",
         "qs": "^6.14.0",
@@ -45,6 +46,7 @@
         "@types/date-fns": "^2.5.3",
         "@types/jsdom": "^21.1.7",
         "@types/node": "^22.15.17",
+        "@types/qs": "^6.14.0",
         "@vitejs/plugin-vue": "^5.2.3",
         "@vitest/eslint-plugin": "^1.1.39",
         "@vue/eslint-config-prettier": "^10.2.0",
@@ -81,7 +81,8 @@
 body {
     font-family: 'Patrick Hand', cursive;
     background-color: var(--light);
-    background-image: var(--paper-texture);
+    // background-image: var(--paper-texture);
+    // background-image: url('@/assets/11.webp');
     // padding: 2rem 1rem;s
     color: var(--dark);
     font-size: 1.1rem;
@@ -917,11 +918,13 @@ select.form-input {
 .modal-backdrop {
     position: fixed;
     inset: 0;
-    background-color: rgba(57, 62, 70, 0.7);
+    background-color: rgba(57, 62, 70, 0.9);
+    /* Increased opacity for better visibility */
     display: flex;
     align-items: center;
     justify-content: center;
-    z-index: 1000;
+    z-index: 9999;
+    /* Increased z-index to ensure it's above other elements */
     opacity: 0;
     visibility: hidden;
     transition:
@@ -941,16 +944,18 @@ select.form-input {
     background-color: var(--light);
     border: var(--border);
     width: 90%;
-    max-width: 550px;
+    max-width: 850px;
     box-shadow: var(--shadow-lg);
     position: relative;
-    overflow-y: scroll;
-    /* Can cause tooltip clipping */
+    overflow-y: auto;
+    /* Changed from scroll to auto */
     transform: scale(0.95) translateY(-20px);
     transition: transform var(--transition-speed) var(--transition-ease-out);
     max-height: 90vh;
     display: flex;
     flex-direction: column;
+    z-index: 10000;
+    /* Ensure modal content is above backdrop */
 }

 .modal-container::before {
47 fe/src/components/CategoryForm.vue Normal file
@@ -0,0 +1,47 @@
<template>
    <form @submit.prevent="handleSubmit">
        <div class="form-group">
            <label for="category-name">Category Name</label>
            <input type="text" id="category-name" v-model="categoryName" required />
        </div>
        <div class="form-actions">
            <button type="submit" :disabled="loading">
                {{ isEditing ? 'Update' : 'Create' }}
            </button>
            <button type="button" @click="emit('cancel')" :disabled="loading">
                Cancel
            </button>
        </div>
    </form>
</template>

<script setup lang="ts">
import { ref, defineProps, defineEmits, onMounted, computed } from 'vue';
import type { Category } from '../stores/categoryStore';

const props = defineProps<{
    category?: Category | null;
    loading: boolean;
}>();

const emit = defineEmits<{
    (e: 'submit', data: { name: string }): void;
    (e: 'cancel'): void;
}>();

const categoryName = ref('');

const isEditing = computed(() => !!props.category);

onMounted(() => {
    if (props.category) {
        categoryName.value = props.category.name;
    }
});

const handleSubmit = () => {
    if (categoryName.value.trim()) {
        emit('submit', { name: categoryName.value.trim() });
    }
};
</script>
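The submit guard in CategoryForm.vue trims the name and refuses to emit when only whitespace remains. That logic can be sketched as a pure standalone function; `buildSubmitPayload` is a hypothetical name used here for illustration.

```typescript
// Standalone sketch of CategoryForm's submit guard: trim the raw input and
// return the emit payload, or null when nothing usable was typed.
function buildSubmitPayload(raw: string): { name: string } | null {
    const name = raw.trim();
    return name ? { name } : null;
}
```

The component only calls `emit('submit', …)` when this returns a non-null payload, so the store never receives an empty category name.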
371 fe/src/components/ChoreItem.vue Normal file
@@ -0,0 +1,371 @@
<template>
    <li class="neo-list-item" :class="`status-${getDueDateStatus(chore)}`">
        <div class="neo-item-content">
            <label class="neo-checkbox-label">
                <input type="checkbox" :checked="chore.is_completed" @change="emit('toggle-completion', chore)">
                <div class="checkbox-content">
                    <div class="chore-main-info">
                        <span class="checkbox-text-span"
                            :class="{ 'neo-completed-static': chore.is_completed && !chore.updating }">
                            {{ chore.name }}
                        </span>
                        <div class="chore-badges">
                            <span v-if="chore.type === 'group'" class="badge badge-group">Group</span>
                            <span v-if="getDueDateStatus(chore) === 'overdue'" class="badge badge-overdue">Overdue</span>
                            <span v-if="getDueDateStatus(chore) === 'due-today'" class="badge badge-due-today">Due Today</span>
                            <span v-if="getDueDateStatus(chore) === 'upcoming'" class="badge badge-upcoming">{{ dueInText }}</span>
                        </div>
                    </div>
                    <div v-if="chore.description" class="chore-description">{{ chore.description }}</div>
                    <span v-if="chore.subtext" class="item-time">{{ chore.subtext }}</span>
                    <div v-if="totalTime > 0" class="total-time">
                        Total Time: {{ formatDuration(totalTime) }}
                    </div>
                </div>
            </label>
            <div class="neo-item-actions">
                <button class="btn btn-sm btn-neutral" @click="toggleTimer"
                    :disabled="chore.is_completed || !chore.current_assignment_id">
                    {{ isActiveTimer ? '⏸️' : '▶️' }}
                </button>
                <button class="btn btn-sm btn-neutral" @click="emit('open-details', chore)" title="View Details">
                    📋
                </button>
                <button class="btn btn-sm btn-neutral" @click="emit('open-history', chore)" title="View History">
                    📅
                </button>
                <button class="btn btn-sm btn-neutral" @click="emit('edit', chore)">
                    ✏️
                </button>
                <button class="btn btn-sm btn-danger" @click="emit('delete', chore)">
                    🗑️
                </button>
            </div>
        </div>
        <ul v-if="chore.child_chores && chore.child_chores.length" class="child-chore-list">
            <ChoreItem v-for="child in chore.child_chores" :key="child.id" :chore="child" :time-entries="timeEntries"
                :active-timer="activeTimer" @toggle-completion="emit('toggle-completion', $event)"
                @edit="emit('edit', $event)" @delete="emit('delete', $event)"
                @open-details="emit('open-details', $event)" @open-history="emit('open-history', $event)"
                @start-timer="emit('start-timer', $event)"
                @stop-timer="(chore, timeEntryId) => emit('stop-timer', chore, timeEntryId)" />
        </ul>
    </li>
</template>

<script setup lang="ts">
import { defineProps, defineEmits, computed } from 'vue';
import { formatDistanceToNow, parseISO, isToday, isPast } from 'date-fns';
import type { ChoreWithCompletion } from '../types/chore';
import type { TimeEntry } from '../stores/timeEntryStore';
import { formatDuration } from '../utils/formatters';

const props = defineProps<{
    chore: ChoreWithCompletion;
    timeEntries: TimeEntry[];
    activeTimer: TimeEntry | null;
}>();

const emit = defineEmits<{
    (e: 'toggle-completion', chore: ChoreWithCompletion): void;
    (e: 'edit', chore: ChoreWithCompletion): void;
    (e: 'delete', chore: ChoreWithCompletion): void;
    (e: 'open-details', chore: ChoreWithCompletion): void;
    (e: 'open-history', chore: ChoreWithCompletion): void;
    (e: 'start-timer', chore: ChoreWithCompletion): void;
    (e: 'stop-timer', chore: ChoreWithCompletion, timeEntryId: number): void;
}>();

const isActiveTimer = computed(() => {
    return props.activeTimer && props.activeTimer.chore_assignment_id === props.chore.current_assignment_id;
});

const totalTime = computed(() => {
    return props.timeEntries.reduce((acc, entry) => acc + (entry.duration_seconds || 0), 0);
});

const dueInText = computed(() => {
    if (!props.chore.next_due_date) return '';
    const dueDate = new Date(props.chore.next_due_date);
    if (isToday(dueDate)) return 'Today';
    return formatDistanceToNow(dueDate, { addSuffix: true });
});

const toggleTimer = () => {
    if (isActiveTimer.value) {
        emit('stop-timer', props.chore, props.activeTimer!.id);
    } else {
        emit('start-timer', props.chore);
    }
};

const getDueDateStatus = (chore: ChoreWithCompletion) => {
    if (chore.is_completed) return 'completed';

    const today = new Date();
    today.setHours(0, 0, 0, 0);

    const dueDate = new Date(chore.next_due_date);
    dueDate.setHours(0, 0, 0, 0);

    if (dueDate < today) return 'overdue';
    if (dueDate.getTime() === today.getTime()) return 'due-today';
    return 'upcoming';
};
</script>

<script lang="ts">
export default {
    name: 'ChoreItem'
}
</script>

<style scoped lang="scss">
/* Neo-style list items */
.neo-list-item {
    display: flex;
    align-items: center;
    padding: 0.75rem 0;
    border-bottom: 1px solid rgba(0, 0, 0, 0.08);
    position: relative;
    transition: background-color 0.2s ease;
}

.neo-list-item:hover {
    background-color: #f8f8f8;
}

.neo-list-item:last-child {
    border-bottom: none;
}

.neo-item-content {
    display: flex;
    justify-content: space-between;
    align-items: center;
    width: 100%;
    gap: 0.5rem;
}

.neo-item-actions {
    display: flex;
    gap: 0.25rem;
    opacity: 0;
    transition: opacity 0.2s ease;
    margin-left: auto;

    .btn {
        margin-left: 0.25rem;
    }
}

.neo-list-item:hover .neo-item-actions {
    opacity: 1;
}

/* Custom Checkbox Styles */
.neo-checkbox-label {
    display: grid;
    grid-template-columns: auto 1fr;
    align-items: center;
    gap: 0.8em;
    cursor: pointer;
    position: relative;
    width: 100%;
    font-weight: 500;
    color: #414856;
    transition: color 0.3s ease;
    margin-bottom: 0;
}

.neo-checkbox-label input[type="checkbox"] {
    appearance: none;
    -webkit-appearance: none;
    -moz-appearance: none;
    position: relative;
    height: 20px;
    width: 20px;
    outline: none;
    border: 2px solid #b8c1d1;
    margin: 0;
    cursor: pointer;
    background: transparent;
    border-radius: 6px;
    display: grid;
    align-items: center;
    justify-content: center;
    transition: all 0.2s cubic-bezier(0.4, 0, 0.2, 1);
}

.neo-checkbox-label input[type="checkbox"]:hover {
    border-color: var(--secondary);
    transform: scale(1.05);
}

.neo-checkbox-label input[type="checkbox"]::before,
.neo-checkbox-label input[type="checkbox"]::after {
    content: none;
}

.neo-checkbox-label input[type="checkbox"]::after {
    content: "";
    position: absolute;
    opacity: 0;
    left: 5px;
    top: 1px;
    width: 6px;
    height: 12px;
    border: solid var(--primary);
    border-width: 0 3px 3px 0;
    transform: rotate(45deg) scale(0);
    transition: all 0.2s cubic-bezier(0.18, 0.89, 0.32, 1.28);
    transition-property: transform, opacity;
}

.neo-checkbox-label input[type="checkbox"]:checked {
    border-color: var(--primary);
}

.neo-checkbox-label input[type="checkbox"]:checked::after {
    opacity: 1;
    transform: rotate(45deg) scale(1);
}

.checkbox-content {
    display: flex;
    flex-direction: column;
    gap: 0.25rem;
    width: 100%;
}

.checkbox-text-span {
    position: relative;
    transition: color 0.4s ease, opacity 0.4s ease;
    width: fit-content;
    font-weight: 500;
    color: var(--dark);
    white-space: nowrap;
    overflow: hidden;
    text-overflow: ellipsis;
}

/* Animated strikethrough line */
.checkbox-text-span::before {
    content: '';
    position: absolute;
    top: 50%;
    left: -0.1em;
    right: -0.1em;
    height: 2px;
    background: var(--dark);
    transform: scaleX(0);
    transform-origin: right;
    transition: transform 0.4s cubic-bezier(0.77, 0, .18, 1);
}

.neo-checkbox-label input[type="checkbox"]:checked~.checkbox-content .checkbox-text-span {
    color: var(--dark);
    opacity: 0.6;
}

.neo-checkbox-label input[type="checkbox"]:checked~.checkbox-content .checkbox-text-span::before {
    transform: scaleX(1);
    transform-origin: left;
    transition: transform 0.4s cubic-bezier(0.77, 0, .18, 1) 0.1s;
}

.neo-completed-static {
    color: var(--dark);
    opacity: 0.6;
    position: relative;
}

.neo-completed-static::before {
    content: '';
    position: absolute;
    top: 50%;
    left: -0.1em;
    right: -0.1em;
    height: 2px;
    background: var(--dark);
    transform: scaleX(1);
    transform-origin: left;
}

/* Status-based styling */
.status-completed {
    opacity: 0.7;
}

/* Chore-specific styles */
.chore-main-info {
    display: flex;
    align-items: center;
    gap: 0.5rem;
    flex-wrap: wrap;
}

.chore-badges {
    display: flex;
    gap: 0.25rem;
}

.badge {
    font-size: 0.75rem;
    padding: 0.125rem 0.375rem;
    border-radius: 0.25rem;
    font-weight: 500;
    text-transform: uppercase;
    letter-spacing: 0.025em;
}

.badge-group {
    background-color: #3b82f6;
    color: white;
}

.badge-overdue {
    background-color: #ef4444;
    color: white;
}

.badge-due-today {
    background-color: #f59e0b;
    color: white;
}

.badge-upcoming {
    background-color: #3b82f6;
    color: white;
}

.chore-description {
    font-size: 0.875rem;
    color: var(--dark);
    opacity: 0.8;
    margin-top: 0.25rem;
    font-style: italic;
}

.item-time {
    font-size: 0.9rem;
    color: var(--dark);
    opacity: 0.7;
}

.total-time {
    font-size: 0.8rem;
    color: #666;
    margin-top: 0.25rem;
}

.child-chore-list {
    list-style: none;
    padding-left: 2rem;
    margin-top: 0.5rem;
    border-left: 2px solid #e5e7eb;
}
</style>
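ChoreItem.vue's `getDueDateStatus` buckets a chore by calendar day: both "now" and the due date are normalized to local midnight with `setHours(0, 0, 0, 0)` so the time of day never affects the comparison. The standalone sketch below reproduces that logic with an injectable `now` for testability; the reduced `ChoreLike` shape is a hypothetical stand-in for `ChoreWithCompletion`.

```typescript
// Minimal reproduction of ChoreItem's due-date bucketing. Dates are compared
// at local-midnight granularity, so a chore due at 09:00 today is "due-today"
// even when checked at 13:30.
type ChoreLike = { is_completed: boolean; next_due_date: string };

function getDueDateStatus(chore: ChoreLike, now: Date = new Date()): string {
    if (chore.is_completed) return 'completed';

    // Copy `now` before mutating, then strip the time component.
    const today = new Date(now);
    today.setHours(0, 0, 0, 0);

    const dueDate = new Date(chore.next_due_date);
    dueDate.setHours(0, 0, 0, 0);

    if (dueDate < today) return 'overdue';
    if (dueDate.getTime() === today.getTime()) return 'due-today';
    return 'upcoming';
}
```

Note that date-only ISO strings like `'2025-06-15'` parse as UTC midnight in JavaScript, so timestamps with an explicit local time are the safer input for this comparison.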
@@ -189,7 +189,7 @@

 <script setup lang="ts">
 import { ref, computed, onMounted, watch } from 'vue';
-import { apiClient, API_ENDPOINTS } from '@/config/api';
+import { apiClient, API_ENDPOINTS } from '@/services/api';
 import { useNotificationStore } from '@/stores/notifications';
 import { useAuthStore } from '@/stores/auth';
 import type { ExpenseCreate, ExpenseSplitCreate } from '@/types/expense';
102 fe/src/components/CreateGroupModal.vue Normal file
@@ -0,0 +1,102 @@
<template>
    <VModal :model-value="isOpen" @update:model-value="closeModal" title="Create New Group">
        <template #default>
            <form @submit.prevent="onSubmit">
                <VFormField label="Group Name" :error-message="formError ?? undefined">
                    <VInput type="text" v-model="groupName" required ref="groupNameInput" />
                </VFormField>
            </form>
        </template>
        <template #footer>
            <VButton variant="neutral" @click="closeModal" type="button">Cancel</VButton>
            <VButton type="submit" variant="primary" :disabled="loading" @click="onSubmit" class="ml-2">
                <VSpinner v-if="loading" size="sm" />
                Create
            </VButton>
        </template>
    </VModal>
</template>

<script setup lang="ts">
import { ref, watch, nextTick } from 'vue';
import { useVModel } from '@vueuse/core';
import { apiClient, API_ENDPOINTS } from '@/services/api';
import { useNotificationStore } from '@/stores/notifications';
import VModal from '@/components/valerie/VModal.vue';
import VFormField from '@/components/valerie/VFormField.vue';
import VInput from '@/components/valerie/VInput.vue';
import VButton from '@/components/valerie/VButton.vue';
import VSpinner from '@/components/valerie/VSpinner.vue';

const props = defineProps<{
    modelValue: boolean;
}>();

const emit = defineEmits<{
    (e: 'update:modelValue', value: boolean): void;
    (e: 'created', newGroup: any): void;
}>();

const isOpen = useVModel(props, 'modelValue', emit);
const groupName = ref('');
const loading = ref(false);
const formError = ref<string | null>(null);
const notificationStore = useNotificationStore();
const groupNameInput = ref<InstanceType<typeof VInput> | null>(null);

watch(isOpen, (newVal) => {
    if (newVal) {
        groupName.value = '';
        formError.value = null;
        nextTick(() => {
            // groupNameInput.value?.focus?.();
        });
    }
});

const closeModal = () => {
    isOpen.value = false;
};

const validateForm = () => {
    formError.value = null;
    if (!groupName.value.trim()) {
        formError.value = 'Name is required';
        return false;
    }
    return true;
};

const onSubmit = async () => {
    if (!validateForm()) {
        return;
    }
    loading.value = true;
    try {
        const payload = { name: groupName.value };
        const response = await apiClient.post(API_ENDPOINTS.GROUPS.BASE, payload);
        notificationStore.addNotification({ message: 'Group created successfully', type: 'success' });
        emit('created', response.data);
        closeModal();
    } catch (error: any) {
        const message = error?.response?.data?.detail || (error instanceof Error ? error.message : 'Failed to create group');
        formError.value = message;
        notificationStore.addNotification({ message, type: 'error' });
        console.error(message, error);
    } finally {
        loading.value = false;
    }
};
</script>

<style>
.form-error-text {
    color: var(--danger);
    font-size: 0.85rem;
    margin-top: 0.25rem;
}

.ml-2 {
    margin-left: 0.5rem;
}
</style>
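The `catch` branch in CreateGroupModal's `onSubmit` builds a user-facing message with a fallback chain: the API's `detail` field first, then the `Error` message, then a generic string. That chain can be isolated as a pure function, sketched below; the `ApiError` shape and the function name are assumptions for illustration (the component inlines this logic with `error: any`).

```typescript
// Sketch of CreateGroupModal's error-message fallback chain. Optional
// chaining makes each lookup safe regardless of what shape `error` has.
interface ApiError { response?: { data?: { detail?: string } } }

function extractErrorMessage(error: unknown, fallback = 'Failed to create group'): string {
    const detail = (error as ApiError)?.response?.data?.detail;
    if (detail) return detail;                                  // API-provided message
    if (error instanceof Error && error.message) return error.message; // e.g. network errors
    return fallback;                                            // last-resort generic text
}
```

The same message is then reused for the inline form error and the toast notification, so both surfaces stay consistent.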
@@ -29,7 +29,7 @@
 <script setup lang="ts">
 import { ref, watch, nextTick, computed } from 'vue';
 import { useVModel } from '@vueuse/core'; // onClickOutside removed
-import { apiClient, API_ENDPOINTS } from '@/config/api'; // Assuming this path is correct
+import { apiClient, API_ENDPOINTS } from '@/services/api'; // Assuming this path is correct
 import { useNotificationStore } from '@/stores/notifications';
 import VModal from '@/components/valerie/VModal.vue';
 import VFormField from '@/components/valerie/VFormField.vue';
@@ -38,6 +38,7 @@ import VTextarea from '@/components/valerie/VTextarea.vue';
 import VSelect from '@/components/valerie/VSelect.vue';
 import VButton from '@/components/valerie/VButton.vue';
 import VSpinner from '@/components/valerie/VSpinner.vue';
+import type { Group } from '@/types/group';

 const props = defineProps<{
     modelValue: boolean;
142 fe/src/components/list-detail/CostSummaryDialog.vue Normal file
@@ -0,0 +1,142 @@
<template>
    <VModal :model-value="modelValue" :title="$t('listDetailPage.modals.costSummary.title')"
        @update:modelValue="$emit('update:modelValue', false)" size="lg">
        <template #default>
            <div v-if="loading" class="text-center">
                <VSpinner :label="$t('listDetailPage.loading.costSummary')" />
            </div>
            <VAlert v-else-if="error" type="error" :message="error" />
            <div v-else-if="summary">
                <div class="mb-3 cost-overview">
                    <p><strong>{{ $t('listDetailPage.modals.costSummary.totalCostLabel') }}</strong> {{ formatCurrency(summary.total_list_cost) }}</p>
                    <p><strong>{{ $t('listDetailPage.modals.costSummary.equalShareLabel') }}</strong> {{ formatCurrency(summary.equal_share_per_user) }}</p>
                    <p><strong>{{ $t('listDetailPage.modals.costSummary.participantsLabel') }}</strong> {{ summary.num_participating_users }}</p>
                </div>
                <h4>{{ $t('listDetailPage.modals.costSummary.userBalancesHeader') }}</h4>
                <div class="table-container mt-2">
                    <table class="table">
                        <thead>
                            <tr>
                                <th>{{ $t('listDetailPage.modals.costSummary.tableHeaders.user') }}</th>
                                <th class="text-right">{{ $t('listDetailPage.modals.costSummary.tableHeaders.itemsAddedValue') }}</th>
                                <th class="text-right">{{ $t('listDetailPage.modals.costSummary.tableHeaders.amountDue') }}</th>
                                <th class="text-right">{{ $t('listDetailPage.modals.costSummary.tableHeaders.balance') }}</th>
                            </tr>
                        </thead>
                        <tbody>
                            <tr v-for="userShare in summary.user_balances" :key="userShare.user_id">
                                <td>{{ userShare.user_identifier }}</td>
                                <td class="text-right">{{ formatCurrency(userShare.items_added_value) }}</td>
                                <td class="text-right">{{ formatCurrency(userShare.amount_due) }}</td>
                                <td class="text-right">
                                    <VBadge :text="formatCurrency(userShare.balance)"
                                        :variant="parseFloat(String(userShare.balance)) >= 0 ? 'settled' : 'pending'" />
                                </td>
                            </tr>
                        </tbody>
                    </table>
                </div>
            </div>
            <p v-else>{{ $t('listDetailPage.modals.costSummary.emptyState') }}</p>
        </template>
        <template #footer>
            <VButton variant="primary" @click="$emit('update:modelValue', false)">{{ $t('listDetailPage.buttons.close') }}</VButton>
        </template>
    </VModal>
</template>

<script setup lang="ts">
import { defineProps, defineEmits } from 'vue';
import type { PropType } from 'vue';
import { useI18n } from 'vue-i18n';
import VModal from '@/components/valerie/VModal.vue';
import VSpinner from '@/components/valerie/VSpinner.vue';
import VAlert from '@/components/valerie/VAlert.vue';
import VBadge from '@/components/valerie/VBadge.vue';
import VButton from '@/components/valerie/VButton.vue';

interface UserCostShare {
    user_id: number;
    user_identifier: string;
    items_added_value: string | number;
    amount_due: string | number;
    balance: string | number;
}

interface ListCostSummaryData {
    list_id: number;
    list_name: string;
    total_list_cost: string | number;
    num_participating_users: number;
    equal_share_per_user: string | number;
    user_balances: UserCostShare[];
}

defineProps({
    modelValue: {
        type: Boolean,
        required: true,
    },
    summary: {
        type: Object as PropType<ListCostSummaryData | null>,
        default: null,
    },
    loading: {
        type: Boolean,
        default: false,
    },
    error: {
        type: String as PropType<string | null>,
        default: null,
    },
});

defineEmits(['update:modelValue']);

const { t } = useI18n();

const formatCurrency = (value: string | number | undefined | null): string => {
    if (value === undefined || value === null) return '$0.00';
    if (typeof value === 'string' && !value.trim()) return '$0.00';
    const numValue = typeof value === 'string' ? parseFloat(value) : value;
    return isNaN(numValue) ? '$0.00' : `$${numValue.toFixed(2)}`;
};
</script>

<style scoped>
.cost-overview p {
    margin-bottom: 0.5rem;
}

.table-container {
    overflow-x: auto;
}

.table {
    width: 100%;
    border-collapse: collapse;
}

.table th,
.table td {
    padding: 0.75rem;
    border-bottom: 1px solid #eee;
}

.table th {
    text-align: left;
    font-weight: 600;
    background-color: #f8f9fa;
}

.text-right {
    text-align: right;
}
</style>
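The `formatCurrency` helper in CostSummaryDialog.vue (also duplicated in ExpenseSection.vue) defends against the mixed `string | number` amounts the API returns: `null`, `undefined`, empty strings, and unparseable strings all collapse to `"$0.00"` instead of leaking `"NaN"` into the UI. Reproduced standalone:

```typescript
// CostSummaryDialog's formatCurrency, extracted verbatim as a pure function.
// Decimal strings are parsed; anything that cannot be parsed renders as $0.00.
function formatCurrency(value: string | number | undefined | null): string {
    if (value === undefined || value === null) return '$0.00';
    if (typeof value === 'string' && !value.trim()) return '$0.00';
    const numValue = typeof value === 'string' ? parseFloat(value) : value;
    return isNaN(numValue) ? '$0.00' : `$${numValue.toFixed(2)}`;
}
```

Since the identical function appears in two components, hoisting it into a shared util (as was already done for `formatDuration` in `../utils/formatters`) would be a natural follow-up.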
384 fe/src/components/list-detail/ExpenseSection.vue Normal file
@@ -0,0 +1,384 @@
<template>
    <section class="neo-expenses-section">
        <VCard v-if="isLoading && expenses.length === 0" class="py-10 text-center">
            <VSpinner :label="$t('listDetailPage.expensesSection.loading')" size="lg" />
        </VCard>
        <VAlert v-else-if="error && expenses.length === 0" type="error" class="mt-4">
            <p>{{ error }}</p>
            <template #actions>
                <VButton @click="$emit('retry-fetch')">
                    {{ $t('listDetailPage.expensesSection.retryButton') }}
                </VButton>
            </template>
        </VAlert>
        <VCard v-else-if="(!expenses || expenses.length === 0) && !isLoading" variant="empty-state" empty-icon="receipt"
            :empty-title="$t('listDetailPage.expensesSection.emptyStateTitle')"
            :empty-message="$t('listDetailPage.expensesSection.emptyStateMessage')" class="mt-4">
        </VCard>
        <div v-else class="neo-expense-list">
            <div v-for="expense in expenses" :key="expense.id" class="neo-expense-item-wrapper">
                <div class="neo-expense-item" @click="toggleExpense(expense.id)"
                    :class="{ 'is-expanded': isExpenseExpanded(expense.id) }">
                    <div class="expense-main-content">
                        <div class="expense-icon-container">
                            <svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24"
                                fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round"
                                stroke-linejoin="round">
                                <line x1="12" x2="12" y1="2" y2="22"></line>
                                <path d="M17 5H9.5a3.5 3.5 0 0 0 0 7h5a3.5 3.5 0 0 1 0 7H6"></path>
                            </svg>
                        </div>
                        <div class="expense-text-content">
                            <div class="neo-expense-header">
                                {{ expense.description }}
                            </div>
                            <div class="neo-expense-details">
                                {{ formatCurrency(expense.total_amount) }} —
                                {{ $t('listDetailPage.expensesSection.paidBy') }} <strong>{{ expense.paid_by_user?.name || expense.paid_by_user?.email }}</strong>
                            </div>
                        </div>
                    </div>
                    <div class="expense-side-content">
                        <span class="neo-expense-status" :class="getStatusClass(expense.overall_settlement_status)">
                            {{ getOverallExpenseStatusText(expense.overall_settlement_status) }}
                        </span>
                        <div class="expense-toggle-icon">
                            <svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24"
                                fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round"
                                stroke-linejoin="round" class="feather feather-chevron-down">
                                <polyline points="6 9 12 15 18 9"></polyline>
                            </svg>
                        </div>
                    </div>
                </div>

                <!-- Collapsible content -->
                <div v-if="isExpenseExpanded(expense.id)" class="neo-splits-container">
                    <div class="neo-splits-list">
                        <div v-for="split in expense.splits" :key="split.id" class="neo-split-item">
                            <div class="split-col split-user">
                                <strong>{{ split.user?.name || split.user?.email || `User ID: ${split.user_id}` }}</strong>
                            </div>
                            <div class="split-col split-owes">
                                {{ $t('listDetailPage.expensesSection.owes') }} <strong>{{ formatCurrency(split.owed_amount) }}</strong>
                            </div>
                            <div class="split-col split-status">
                                <span class="neo-expense-status" :class="getStatusClass(split.status)">
                                    {{ getSplitStatusText(split.status) }}
                                </span>
                            </div>
                            <div class="split-col split-paid-info">
                                <div v-if="split.paid_at" class="paid-details">
                                    {{ $t('listDetailPage.expensesSection.paidAmount') }} {{ getPaidAmountForSplitDisplay(split) }}
                                    <span v-if="split.paid_at"> {{ $t('listDetailPage.expensesSection.onDate') }} {{ new Date(split.paid_at).toLocaleDateString() }}</span>
                                </div>
                            </div>
                            <div class="split-col split-action">
                                <button
                                    v-if="split.user_id === currentUserId && split.status !== ExpenseSplitStatusEnum.PAID"
                                    class="btn btn-sm btn-primary" @click="$emit('settle-share', expense, split)"
                                    :disabled="isSettlementLoading">
                                    {{ $t('listDetailPage.expensesSection.settleShareButton') }}
                                </button>
                            </div>
                            <ul v-if="split.settlement_activities && split.settlement_activities.length > 0"
                                class="neo-settlement-activities">
                                <li v-for="activity in split.settlement_activities" :key="activity.id">
                                    {{ $t('listDetailPage.expensesSection.activityLabel') }} {{ formatCurrency(activity.amount_paid) }}
                                    {{ $t('listDetailPage.expensesSection.byUser') }} {{ activity.payer?.name || `User ${activity.paid_by_user_id}` }} {{ $t('listDetailPage.expensesSection.onDate') }} {{ new Date(activity.paid_at).toLocaleDateString() }}
                                </li>
                            </ul>
                        </div>
                    </div>
                </div>
            </div>
        </div>
    </section>
</template>

<script setup lang="ts">
import { ref, defineProps, defineEmits } from 'vue';
import type { PropType } from 'vue';
import { useI18n } from 'vue-i18n';
import type { Expense, ExpenseSplit } from '@/types/expense';
import { ExpenseOverallStatusEnum, ExpenseSplitStatusEnum } from '@/types/expense';
import { useListDetailStore } from '@/stores/listDetailStore';
import VCard from '@/components/valerie/VCard.vue';
import VSpinner from '@/components/valerie/VSpinner.vue';
import VAlert from '@/components/valerie/VAlert.vue';
import VButton from '@/components/valerie/VButton.vue';

const props = defineProps({
    expenses: {
        type: Array as PropType<Expense[]>,
        required: true,
    },
    isLoading: {
        type: Boolean,
        default: false,
    },
    error: {
        type: String as PropType<string | null>,
        default: null,
    },
    currentUserId: {
        type: Number as PropType<number | null>,
        required: true,
    },
    isSettlementLoading: {
        type: Boolean,
        default: false,
    }
});

defineEmits(['retry-fetch', 'settle-share']);

const { t } = useI18n();
const listDetailStore = useListDetailStore();

const expandedExpenses = ref<Set<number>>(new Set());

const toggleExpense = (expenseId: number) => {
    const newSet = new Set(expandedExpenses.value);
    if (newSet.has(expenseId)) {
        newSet.delete(expenseId);
    } else {
        newSet.add(expenseId);
    }
    expandedExpenses.value = newSet;
};

const isExpenseExpanded = (expenseId: number) => {
    return expandedExpenses.value.has(expenseId);
};

const formatCurrency = (value: string | number | undefined | null): string => {
    if (value === undefined || value === null) return '$0.00';
    if (typeof value === 'string' && !value.trim()) return '$0.00';
    const numValue = typeof value === 'string' ? parseFloat(value) : value;
    return isNaN(numValue) ? '$0.00' : `$${numValue.toFixed(2)}`;
};

const getPaidAmountForSplitDisplay = (split: ExpenseSplit): string => {
    const amount = listDetailStore.getPaidAmountForSplit(split.id);
    return formatCurrency(amount);
};

const getSplitStatusText = (status: ExpenseSplitStatusEnum): string => {
    switch (status) {
        case ExpenseSplitStatusEnum.PAID: return t('listDetailPage.status.paid');
        case ExpenseSplitStatusEnum.PARTIALLY_PAID: return t('listDetailPage.status.partiallyPaid');
|
||||
case ExpenseSplitStatusEnum.UNPAID: return t('listDetailPage.status.unpaid');
|
||||
default: return t('listDetailPage.status.unknown');
|
||||
}
|
||||
};
|
||||
|
||||
const getOverallExpenseStatusText = (status: ExpenseOverallStatusEnum): string => {
|
||||
switch (status) {
|
||||
case ExpenseOverallStatusEnum.PAID: return t('listDetailPage.status.settled');
|
||||
case ExpenseOverallStatusEnum.PARTIALLY_PAID: return t('listDetailPage.status.partiallySettled');
|
||||
case ExpenseOverallStatusEnum.UNPAID: return t('listDetailPage.status.unsettled');
|
||||
default: return t('listDetailPage.status.unknown');
|
||||
}
|
||||
};
|
||||
|
||||
const getStatusClass = (status: ExpenseSplitStatusEnum | ExpenseOverallStatusEnum): string => {
|
||||
if (status === ExpenseSplitStatusEnum.PAID || status === ExpenseOverallStatusEnum.PAID) return 'status-paid';
|
||||
if (status === ExpenseSplitStatusEnum.PARTIALLY_PAID || status === ExpenseOverallStatusEnum.PARTIALLY_PAID) return 'status-partially_paid';
|
||||
if (status === ExpenseSplitStatusEnum.UNPAID || status === ExpenseOverallStatusEnum.UNPAID) return 'status-unpaid';
|
||||
return '';
|
||||
};
|
||||
</script>
|
||||
|
||||
<style scoped>
|
||||
.neo-expenses-section {
|
||||
padding: 0;
|
||||
margin-top: 1.2rem;
|
||||
}
|
||||
|
||||
.neo-expense-list {
|
||||
background-color: rgb(255, 248, 240);
|
||||
border-radius: 12px;
|
||||
overflow: hidden;
|
||||
border: 1px solid #f0e5d8;
|
||||
}
|
||||
|
||||
.neo-expense-item-wrapper {
|
||||
border-bottom: 1px solid #f0e5d8;
|
||||
}
|
||||
|
||||
.neo-expense-item-wrapper:last-child {
|
||||
border-bottom: none;
|
||||
}
|
||||
|
||||
.neo-expense-item {
|
||||
padding: 1rem 1.2rem;
|
||||
cursor: pointer;
|
||||
display: flex;
|
||||
justify-content: space-between;
|
||||
align-items: center;
|
||||
transition: background-color 0.2s ease;
|
||||
}
|
||||
|
||||
.neo-expense-item:hover {
|
||||
background-color: rgba(0, 0, 0, 0.02);
|
||||
}
|
||||
|
||||
.neo-expense-item.is-expanded .expense-toggle-icon {
|
||||
transform: rotate(180deg);
|
||||
}
|
||||
|
||||
.expense-main-content {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
gap: 1rem;
|
||||
}
|
||||
|
||||
.expense-icon-container {
|
||||
color: #d99a53;
|
||||
}
|
||||
|
||||
.expense-text-content {
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
}
|
||||
|
||||
.expense-side-content {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
gap: 1rem;
|
||||
}
|
||||
|
||||
.expense-toggle-icon {
|
||||
color: #888;
|
||||
transition: transform 0.3s ease;
|
||||
}
|
||||
|
||||
.neo-expense-header {
|
||||
font-size: 1.1rem;
|
||||
font-weight: 600;
|
||||
margin-bottom: 0.1rem;
|
||||
}
|
||||
|
||||
.neo-expense-details,
|
||||
.neo-split-details {
|
||||
font-size: 0.9rem;
|
||||
color: #555;
|
||||
margin-bottom: 0.3rem;
|
||||
}
|
||||
|
||||
.neo-expense-details strong,
|
||||
.neo-split-details strong {
|
||||
color: #111;
|
||||
}
|
||||
|
||||
.neo-expense-status {
|
||||
display: inline-block;
|
||||
padding: 0.25em 0.6em;
|
||||
font-size: 0.85em;
|
||||
font-weight: 700;
|
||||
line-height: 1;
|
||||
text-align: center;
|
||||
white-space: nowrap;
|
||||
vertical-align: baseline;
|
||||
border-radius: 0.375rem;
|
||||
margin-left: 0.5rem;
|
||||
color: #22c55e;
|
||||
}
|
||||
|
||||
.status-unpaid {
|
||||
background-color: #fee2e2;
|
||||
color: #dc2626;
|
||||
}
|
||||
|
||||
.status-partially_paid {
|
||||
background-color: #ffedd5;
|
||||
color: #f97316;
|
||||
}
|
||||
|
||||
.status-paid {
|
||||
background-color: #dcfce7;
|
||||
color: #22c55e;
|
||||
}
|
||||
|
||||
.neo-splits-container {
|
||||
padding: 0.5rem 1.2rem 1.2rem;
|
||||
background-color: rgba(255, 255, 255, 0.5);
|
||||
}
|
||||
|
||||
.neo-splits-list {
|
||||
margin-top: 0rem;
|
||||
padding-left: 0;
|
||||
border-left: none;
|
||||
}
|
||||
|
||||
.neo-split-item {
|
||||
padding: 0.75rem 0;
|
||||
border-bottom: 1px dashed #f0e5d8;
|
||||
display: grid;
|
||||
grid-template-areas:
|
||||
"user owes status paid action"
|
||||
"activities activities activities activities activities";
|
||||
grid-template-columns: 1.5fr 1fr 1fr 1.5fr auto;
|
||||
gap: 0.5rem 1rem;
|
||||
align-items: center;
|
||||
}
|
||||
|
||||
.neo-split-item:last-child {
|
||||
border-bottom: none;
|
||||
}
|
||||
|
||||
.split-col.split-user {
|
||||
grid-area: user;
|
||||
}
|
||||
|
||||
.split-col.split-owes {
|
||||
grid-area: owes;
|
||||
}
|
||||
|
||||
.split-col.split-status {
|
||||
grid-area: status;
|
||||
}
|
||||
|
||||
.split-col.split-paid-info {
|
||||
grid-area: paid;
|
||||
}
|
||||
|
||||
.split-col.split-action {
|
||||
grid-area: action;
|
||||
justify-self: end;
|
||||
}
|
||||
|
||||
.split-col.neo-settlement-activities {
|
||||
grid-area: activities;
|
||||
font-size: 0.8em;
|
||||
color: #555;
|
||||
padding-left: 1em;
|
||||
list-style-type: disc;
|
||||
margin-top: 0.5em;
|
||||
}
|
||||
|
||||
.neo-settlement-activities {
|
||||
font-size: 0.8em;
|
||||
color: #555;
|
||||
padding-left: 1em;
|
||||
list-style-type: disc;
|
||||
margin-top: 0.5em;
|
||||
}
|
||||
|
||||
.neo-settlement-activities li {
|
||||
margin-top: 0.2em;
|
||||
}
|
||||
</style>
|
255
fe/src/components/list-detail/ItemsList.vue
Normal file
@@ -0,0 +1,255 @@
<template>
    <div class="neo-item-list-container">
        <div v-for="group in groupedItems" :key="group.categoryName" class="category-group"
            :class="{ 'highlight': supermarktMode && group.items.some(i => i.is_complete) }">
            <h3 v-if="group.items.length" class="category-header">{{ group.categoryName }}</h3>
            <draggable :list="group.items" item-key="id" handle=".drag-handle" @end="handleDragEnd"
                :disabled="!isOnline || supermarktMode" class="neo-item-list" ghost-class="sortable-ghost"
                drag-class="sortable-drag">
                <template #item="{ element: item }">
                    <ListItem :item="item" :is-online="isOnline" :category-options="categoryOptions"
                        :supermarkt-mode="supermarktMode" @delete-item="$emit('delete-item', item)"
                        @checkbox-change="(item, checked) => $emit('checkbox-change', item, checked)"
                        @update-price="$emit('update-price', item)" @start-edit="$emit('start-edit', item)"
                        @save-edit="$emit('save-edit', item)" @cancel-edit="$emit('cancel-edit', item)"
                        @update:editName="item.editName = $event" @update:editQuantity="item.editQuantity = $event"
                        @update:editCategoryId="item.editCategoryId = $event"
                        @update:priceInput="item.priceInput = $event" />
                </template>
            </draggable>
        </div>

        <!-- New Add Item LI, integrated into the list -->
        <li class="neo-list-item new-item-input-container" v-show="!supermarktMode">
            <label class="neo-checkbox-label">
                <input type="checkbox" disabled />
                <input type="text" class="neo-new-item-input"
                    :placeholder="$t('listDetailPage.items.addItemForm.placeholder')" ref="itemNameInputRef"
                    :value="newItem.name"
                    @input="$emit('update:newItemName', ($event.target as HTMLInputElement).value)"
                    @keyup.enter="$emit('add-item')" @blur="handleNewItemBlur" @click.stop />
                <VSelect
                    :model-value="newItem.category_id === null || newItem.category_id === undefined ? '' : newItem.category_id"
                    @update:modelValue="$emit('update:newItemCategoryId', $event === '' ? null : $event)"
                    :options="safeCategoryOptions" placeholder="Category" class="w-40" size="sm" />
            </label>
        </li>
    </div>
</template>

<script setup lang="ts">
import { ref, computed, defineProps, defineEmits } from 'vue';
import type { PropType } from 'vue';
import draggable from 'vuedraggable';
import { useI18n } from 'vue-i18n';
import ListItem from './ListItem.vue';
import VSelect from '@/components/valerie/VSelect.vue';
import type { Item } from '@/types/item';

interface ItemWithUI extends Item {
    updating: boolean;
    deleting: boolean;
    priceInput: string | number | null;
    swiped: boolean;
    isEditing?: boolean;
    editName?: string;
    editQuantity?: number | string | null;
    editCategoryId?: number | null;
    showFirework?: boolean;
    group_id?: number;
}

const props = defineProps({
    items: {
        type: Array as PropType<ItemWithUI[]>,
        required: true,
    },
    isOnline: {
        type: Boolean,
        required: true,
    },
    supermarktMode: {
        type: Boolean,
        default: false,
    },
    categoryOptions: {
        type: Array as PropType<{ label: string; value: number | null }[]>,
        required: true,
    },
    newItem: {
        type: Object as PropType<{ name: string; category_id?: number | null }>,
        required: true,
    },
    categories: {
        type: Array as PropType<{ id: number; name: string }[]>,
        required: true,
    }
});

const emit = defineEmits([
    'delete-item',
    'checkbox-change',
    'update-price',
    'start-edit',
    'save-edit',
    'cancel-edit',
    'add-item',
    'handle-drag-end',
    'update:newItemName',
    'update:newItemCategoryId',
]);

const { t } = useI18n();
const itemNameInputRef = ref<HTMLInputElement | null>(null);

const safeCategoryOptions = computed(() => props.categoryOptions.map(opt => ({
    ...opt,
    value: opt.value === null ? '' : opt.value
})));

const groupedItems = computed(() => {
    const groups: Record<string, { categoryName: string; items: ItemWithUI[] }> = {};

    props.items.forEach(item => {
        const categoryId = item.category_id;
        const category = props.categories.find(c => c.id === categoryId);
        const categoryName = category ? category.name : t('listDetailPage.items.noCategory');

        if (!groups[categoryName]) {
            groups[categoryName] = { categoryName, items: [] };
        }
        groups[categoryName].items.push(item);
    });

    return Object.values(groups);
});

const handleDragEnd = (evt: any) => {
    // We need to find the original item and its new global index
    const item = evt.item.__vue__.$props.item;
    let newIndex = 0;
    let found = false;

    for (const group of groupedItems.value) {
        if (found) break;
        for (const i of group.items) {
            if (i.id === item.id) {
                found = true;
                break;
            }
            newIndex++;
        }
    }

    // Create a new event object with the necessary info
    const newEvt = {
        item,
        newIndex: newIndex,
        oldIndex: evt.oldIndex, // This oldIndex is relative to the group
    };

    emit('handle-drag-end', newEvt);
};

const handleNewItemBlur = (event: FocusEvent) => {
    const inputElement = event.target as HTMLInputElement;
    if (inputElement.value.trim()) {
        emit('add-item');
    }
};

const focusNewItemInput = () => {
    itemNameInputRef.value?.focus();
}

defineExpose({
    focusNewItemInput
});

</script>

<style scoped>
.neo-checkbox-label {
    display: flex;
    align-items: center;
    gap: 1rem;
}

.neo-item-list-container {
    border: 3px solid #111;
    border-radius: 18px;
    background: var(--light);
    box-shadow: 6px 6px 0 #111;
    overflow: hidden;
}

.neo-item-list {
    list-style: none;
    padding: 1.2rem;
    padding-inline: 0;
    margin-bottom: 0;
    border-bottom: 1px solid #eee;
    background: var(--light);
}

.new-item-input-container {
    list-style: none !important;
    padding-inline: 3rem;
    padding-bottom: 1.2rem;
}

.new-item-input-container .neo-checkbox-label {
    width: 100%;
}

.neo-new-item-input {
    all: unset;
    height: 100%;
    width: 100%;
    font-size: 1.05rem;
    font-weight: 500;
    color: #444;
    padding: 0.2rem 0;
    border-bottom: 1px dashed #ccc;
    transition: border-color 0.2s ease;
}

.neo-new-item-input:focus {
    border-bottom-color: var(--secondary);
}

.neo-new-item-input::placeholder {
    color: #999;
    font-weight: 400;
}

.sortable-ghost {
    opacity: 0.5;
    background: #f0f0f0;
}

.sortable-drag {
    background: white;
    box-shadow: 0 4px 12px rgba(0, 0, 0, 0.1);
}

.category-group {
    margin-bottom: 1.5rem;
}

.category-header {
    font-size: 1.5rem;
    font-weight: 700;
    margin-bottom: 0.75rem;
    padding: 0 1.2rem;
}

.category-group.highlight .neo-list-item:not(.is-complete) {
    background-color: #e6f7ff;
}

.w-40 {
    width: 20%;
}
</style>
497
fe/src/components/list-detail/ListItem.vue
Normal file
@@ -0,0 +1,497 @@
<template>
    <li class="neo-list-item"
        :class="{ 'bg-gray-100 opacity-70': item.is_complete, 'item-pending-sync': isItemPendingSync }">
        <div class="neo-item-content">
            <!-- Drag Handle -->
            <div class="drag-handle" v-if="isOnline && !supermarktMode">
                <svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 24 24" fill="none"
                    stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
                    <circle cx="9" cy="12" r="1"></circle>
                    <circle cx="9" cy="5" r="1"></circle>
                    <circle cx="9" cy="19" r="1"></circle>
                    <circle cx="15" cy="12" r="1"></circle>
                    <circle cx="15" cy="5" r="1"></circle>
                    <circle cx="15" cy="19" r="1"></circle>
                </svg>
            </div>
            <!-- Content when NOT editing -->
            <template v-if="!item.isEditing">
                <label class="neo-checkbox-label" @click.stop>
                    <input type="checkbox" :checked="item.is_complete"
                        @change="$emit('checkbox-change', item, ($event.target as HTMLInputElement).checked)" />
                    <div class="checkbox-content">
                        <div class="item-text-container">
                            <span class="checkbox-text-span"
                                :class="{ 'neo-completed-static': item.is_complete && !item.updating }">
                                {{ item.name }}
                            </span>
                            <span v-if="item.quantity" class="text-sm text-gray-500 ml-1">× {{ item.quantity }}</span>

                            <!-- User Information -->
                            <div class="item-user-info" v-if="item.added_by_user || item.completed_by_user">
                                <span v-if="item.added_by_user" class="user-badge added-by"
                                    :title="$t('listDetailPage.items.addedByTooltip', { name: item.added_by_user.name })">
                                    {{ $t('listDetailPage.items.addedBy') }} {{ item.added_by_user.name }}
                                </span>
                                <span v-if="item.is_complete && item.completed_by_user" class="user-badge completed-by"
                                    :title="$t('listDetailPage.items.completedByTooltip', { name: item.completed_by_user.name })">
                                    {{ $t('listDetailPage.items.completedBy') }} {{ item.completed_by_user.name }}
                                </span>
                            </div>
                        </div>
                        <div v-if="item.is_complete" class="neo-price-input">
                            <VInput type="number" :model-value="item.priceInput || ''" @update:modelValue="onPriceInput"
                                :placeholder="$t('listDetailPage.items.pricePlaceholder')" size="sm" class="w-24"
                                step="0.01" @blur="$emit('update-price', item)"
                                @keydown.enter.prevent="($event.target as HTMLInputElement).blur()" />
                        </div>
                    </div>
                </label>
                <div class="neo-item-actions" v-if="!supermarktMode">
                    <button class="neo-icon-button neo-edit-button" @click.stop="$emit('start-edit', item)"
                        :aria-label="$t('listDetailPage.items.editItemAriaLabel')">
                        <svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 24 24" fill="none"
                            stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
                            <path d="M11 4H4a2 2 0 0 0-2 2v14a2 2 0 0 0 2 2h14a2 2 0 0 0 2-2v-7"></path>
                            <path d="M18.5 2.5a2.121 2.121 0 0 1 3 3L12 15l-4 1 1-4 9.5-9.5z"></path>
                        </svg>
                    </button>
                    <button class="neo-icon-button neo-delete-button" @click.stop="$emit('delete-item', item)"
                        :disabled="item.deleting" :aria-label="$t('listDetailPage.items.deleteItemAriaLabel')">
                        <svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 24 24" fill="none"
                            stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
                            <path d="M3 6h18"></path>
                            <path d="M19 6v14a2 2 0 0 1-2 2H7a2 2 0 0 1-2-2V6m3 0V4a2 2 0 0 1 2-2h4a2 2 0 0 1 2 2v2"></path>
                            <line x1="10" y1="11" x2="10" y2="17"></line>
                            <line x1="14" y1="11" x2="14" y2="17"></line>
                        </svg>
                    </button>
                </div>
            </template>
            <!-- Content WHEN editing -->
            <template v-else>
                <div class="inline-edit-form flex-grow flex items-center gap-2">
                    <VInput type="text" :model-value="item.editName ?? ''"
                        @update:modelValue="$emit('update:editName', $event)" required class="flex-grow" size="sm"
                        @keydown.enter.prevent="$emit('save-edit', item)"
                        @keydown.esc.prevent="$emit('cancel-edit', item)" />
                    <VInput type="number" :model-value="item.editQuantity || ''"
                        @update:modelValue="$emit('update:editQuantity', $event)" min="1" class="w-20" size="sm"
                        @keydown.enter.prevent="$emit('save-edit', item)"
                        @keydown.esc.prevent="$emit('cancel-edit', item)" />
                    <VSelect :model-value="categoryModel" @update:modelValue="categoryModel = $event"
                        :options="safeCategoryOptions" placeholder="Category" class="w-40" size="sm" />
                </div>
                <div class="neo-item-actions">
                    <button class="neo-icon-button neo-save-button" @click.stop="$emit('save-edit', item)"
                        :aria-label="$t('listDetailPage.buttons.saveChanges')">
                        <svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 24 24" fill="none"
                            stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
                            <path d="M19 21H5a2 2 0 0 1-2-2V5a2 2 0 0 1 2-2h11l5 5v11a2 2 0 0 1-2 2z"></path>
                            <polyline points="17 21 17 13 7 13 7 21"></polyline>
                            <polyline points="7 3 7 8 15 8"></polyline>
                        </svg>
                    </button>
                    <button class="neo-icon-button neo-cancel-button" @click.stop="$emit('cancel-edit', item)"
                        :aria-label="$t('listDetailPage.buttons.cancel')">
                        <svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 24 24" fill="none"
                            stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
                            <circle cx="12" cy="12" r="10"></circle>
                            <line x1="15" y1="9" x2="9" y2="15"></line>
                            <line x1="9" y1="9" x2="15" y2="15"></line>
                        </svg>
                    </button>
                    <button class="neo-icon-button neo-delete-button" @click.stop="$emit('delete-item', item)"
                        :disabled="item.deleting" :aria-label="$t('listDetailPage.items.deleteItemAriaLabel')">
                        <svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 24 24" fill="none"
                            stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
                            <path d="M3 6h18"></path>
                            <path d="M19 6v14a2 2 0 0 1-2 2H7a2 2 0 0 1-2-2V6m3 0V4a2 2 0 0 1 2-2h4a2 2 0 0 1 2 2v2"></path>
                            <line x1="10" y1="11" x2="10" y2="17"></line>
                            <line x1="14" y1="11" x2="14" y2="17"></line>
                        </svg>
                    </button>
                </div>
            </template>
        </div>
    </li>
</template>

<script setup lang="ts">
import { defineProps, defineEmits, computed } from 'vue';
import type { PropType } from 'vue';
import { useI18n } from 'vue-i18n';
import type { Item } from '@/types/item';
import VInput from '@/components/valerie/VInput.vue';
import VSelect from '@/components/valerie/VSelect.vue';
import { useOfflineStore } from '@/stores/offline';

interface ItemWithUI extends Item {
    updating: boolean;
    deleting: boolean;
    priceInput: string | number | null;
    swiped: boolean;
    isEditing?: boolean;
    editName?: string;
    editQuantity?: number | string | null;
    editCategoryId?: number | null;
    showFirework?: boolean;
}

const props = defineProps({
    item: {
        type: Object as PropType<ItemWithUI>,
        required: true,
    },
    isOnline: {
        type: Boolean,
        required: true,
    },
    categoryOptions: {
        type: Array as PropType<{ label: string; value: number | null }[]>,
        required: true,
    },
    supermarktMode: {
        type: Boolean,
        default: false,
    },
});

const emit = defineEmits([
    'delete-item',
    'checkbox-change',
    'update-price',
    'start-edit',
    'save-edit',
    'cancel-edit',
    'update:editName',
    'update:editQuantity',
    'update:editCategoryId',
    'update:priceInput'
]);

const { t } = useI18n();
const offlineStore = useOfflineStore();

const safeCategoryOptions = computed(() => props.categoryOptions.map(opt => ({
    ...opt,
    value: opt.value === null ? '' : opt.value
})));

const categoryModel = computed({
    get: () => props.item.editCategoryId === null || props.item.editCategoryId === undefined ? '' : props.item.editCategoryId,
    set: (value) => {
        emit('update:editCategoryId', value === '' ? null : value);
    }
});

const isItemPendingSync = computed(() => {
    return offlineStore.pendingActions.some(action => {
        if (action.type === 'update_list_item' || action.type === 'delete_list_item') {
            const payload = action.payload as { listId: string; itemId: string };
            return payload.itemId === String(props.item.id);
        }
        return false;
    });
});

const onPriceInput = (value: string | number) => {
    emit('update:priceInput', value);
}

</script>

<style scoped>
.neo-list-item {
    padding: 1rem 0;
    border-bottom: 1px solid #eee;
    transition: background-color 0.2s ease;
}

.neo-list-item:last-child {
    border-bottom: none;
}

.neo-list-item:hover {
    background-color: #f8f8f8;
}

@media (max-width: 600px) {
    .neo-list-item {
        padding: 0.75rem 1rem;
    }
}

.item-pending-sync {
    /* You can add specific styling for pending items, e.g., a subtle glow or background */
}

.neo-item-content {
    display: flex;
    justify-content: space-between;
    align-items: center;
    width: 100%;
    gap: 0.5rem;
}

.neo-item-actions {
    display: flex;
    gap: 0.25rem;
    opacity: 0;
    transition: opacity 0.2s ease;
    margin-left: auto;
}

.neo-list-item:hover .neo-item-actions {
    opacity: 1;
}

.inline-edit-form {
    display: flex;
    gap: 0.5rem;
    align-items: center;
    flex-grow: 1;
}

.neo-icon-button {
    padding: 0.5rem;
    border-radius: 4px;
    color: #666;
    transition: all 0.2s ease;
    display: inline-flex;
    align-items: center;
    justify-content: center;
    background: transparent;
    border: none;
    cursor: pointer;
}

.neo-icon-button:hover {
    background: #f0f0f0;
    color: #333;
}

.neo-edit-button {
    color: #3b82f6;
}

.neo-edit-button:hover {
    background: #eef7fd;
    color: #2563eb;
}

.neo-delete-button {
    color: #ef4444;
}

.neo-delete-button:hover {
    background: #fee2e2;
    color: #dc2626;
}

.neo-save-button {
    color: #22c55e;
}

.neo-save-button:hover {
    background: #dcfce7;
    color: #16a34a;
}

.neo-cancel-button {
    color: #ef4444;
}

.neo-cancel-button:hover {
    background: #fee2e2;
    color: #dc2626;
}

/* Custom Checkbox Styles */
.neo-checkbox-label {
    display: grid;
    grid-template-columns: auto 1fr;
    align-items: center;
    gap: 0.8em;
    cursor: pointer;
    position: relative;
    width: 100%;
    font-weight: 500;
    color: #414856;
    transition: color 0.3s ease;
}

.neo-checkbox-label input[type="checkbox"] {
    appearance: none;
    position: relative;
    height: 20px;
    width: 20px;
    outline: none;
    border: 2px solid #b8c1d1;
    margin: 0;
    cursor: pointer;
    background: transparent;
    border-radius: 6px;
    display: grid;
    align-items: center;
    justify-content: center;
    transition: all 0.2s cubic-bezier(0.4, 0, 0.2, 1);
}

.neo-checkbox-label input[type="checkbox"]:hover {
    border-color: var(--secondary);
    transform: scale(1.05);
}

.neo-checkbox-label input[type="checkbox"]::after {
    content: "";
    position: absolute;
    opacity: 0;
    left: 5px;
    top: 1px;
    width: 6px;
    height: 12px;
    border: solid var(--primary);
    border-width: 0 3px 3px 0;
    transform: rotate(45deg) scale(0);
    transition: all 0.2s cubic-bezier(0.18, 0.89, 0.32, 1.28);
    transition-property: transform, opacity;
}

.neo-checkbox-label input[type="checkbox"]:checked {
    border-color: var(--primary);
}

.neo-checkbox-label input[type="checkbox"]:checked::after {
    opacity: 1;
    transform: rotate(45deg) scale(1);
}

.checkbox-content {
    display: flex;
    align-items: center;
    gap: 0.5rem;
    width: 100%;
}

.checkbox-text-span {
    position: relative;
    transition: color 0.4s ease, opacity 0.4s ease;
    width: fit-content;
}

.checkbox-text-span::before {
    content: '';
    position: absolute;
    top: 50%;
    left: -0.1em;
    right: -0.1em;
    height: 2px;
    background: var(--dark);
    transform: scaleX(0);
    transform-origin: right;
    transition: transform 0.4s cubic-bezier(0.77, 0, .18, 1);
}

.neo-checkbox-label input[type="checkbox"]:checked~.checkbox-content .checkbox-text-span {
    color: var(--dark);
    opacity: 0.6;
}

.neo-checkbox-label input[type="checkbox"]:checked~.checkbox-content .checkbox-text-span::before {
    transform: scaleX(1);
    transform-origin: left;
    transition: transform 0.4s cubic-bezier(0.77, 0, .18, 1) 0.1s;
}

.neo-completed-static {
    color: var(--dark);
    opacity: 0.6;
    position: relative;
}

.neo-completed-static::before {
    content: '';
    position: absolute;
    top: 50%;
    left: -0.1em;
    right: -0.1em;
    height: 2px;
    background: var(--dark);
    transform: scaleX(1);
    transform-origin: left;
}

.neo-price-input {
    display: inline-flex;
    align-items: center;
    margin-left: 0.5rem;
    opacity: 0.7;
    transition: opacity 0.2s ease;
}

.neo-list-item:hover .neo-price-input {
    opacity: 1;
}

.drag-handle {
    cursor: grab;
    padding: 0.5rem;
    color: #666;
    opacity: 0;
    transition: opacity 0.2s ease;
    display: flex;
    align-items: center;
    justify-content: center;
}

.neo-list-item:hover .drag-handle {
    opacity: 0.5;
}

.drag-handle:hover {
    opacity: 1 !important;
    color: #333;
}

.drag-handle:active {
    cursor: grabbing;
}

/* User Information Styles */
.item-text-container {
    display: flex;
    flex-direction: column;
    gap: 0.25rem;
}

.item-user-info {
    display: flex;
    gap: 0.5rem;
    flex-wrap: wrap;
}

.user-badge {
    font-size: 0.75rem;
    color: #6b7280;
    background: #f3f4f6;
    padding: 0.125rem 0.375rem;
    border-radius: 0.25rem;
    border: 1px solid #e5e7eb;
    white-space: nowrap;
}

.user-badge.added-by {
    color: #059669;
    background: #ecfdf5;
    border-color: #a7f3d0;
}

.user-badge.completed-by {
    color: #7c3aed;
    background: #f3e8ff;
    border-color: #c4b5fd;
}
</style>
114
fe/src/components/list-detail/OcrDialog.vue
Normal file
@@ -0,0 +1,114 @@
<template>
    <VModal :model-value="modelValue" :title="$t('listDetailPage.modals.ocr.title')"
        @update:modelValue="$emit('update:modelValue', $event)">
        <template #default>
            <div v-if="ocrLoading" class="text-center">
                <VSpinner :label="$t('listDetailPage.loading.ocrProcessing')" />
            </div>
            <VList v-else-if="ocrItems.length > 0">
                <VListItem v-for="(ocrItem, index) in ocrItems" :key="index">
                    <div class="flex items-center gap-2">
                        <VInput type="text" v-model="ocrItem.name" class="flex-grow" required />
                        <VButton variant="danger" size="sm" :icon-only="true" iconLeft="trash"
                            @click="ocrItems.splice(index, 1)" />
                    </div>
                </VListItem>
            </VList>
            <VFormField v-else :label="$t('listDetailPage.modals.ocr.uploadLabel')"
                :error-message="ocrError || undefined">
                <VInput type="file" id="ocrFile" accept="image/*" @change="handleOcrFileUpload" ref="ocrFileInputRef"
                    :model-value="''" />
            </VFormField>
        </template>
        <template #footer>
            <VButton variant="neutral" @click="$emit('update:modelValue', false)">{{ $t('listDetailPage.buttons.cancel')
            }}</VButton>
            <VButton v-if="ocrItems.length > 0" type="button" variant="primary" @click="confirmAddItems"
                :disabled="isAdding">
                <VSpinner v-if="isAdding" size="sm" />
                {{ $t('listDetailPage.buttons.addItems') }}
            </VButton>
        </template>
    </VModal>
</template>

<script setup lang="ts">
import { ref, watch, defineProps, defineEmits } from 'vue';
import { useI18n } from 'vue-i18n';
import { apiClient, API_ENDPOINTS } from '@/services/api';
import { getApiErrorMessage } from '@/utils/errors';
import VModal from '@/components/valerie/VModal.vue';
import VSpinner from '@/components/valerie/VSpinner.vue';
import VList from '@/components/valerie/VList.vue';
import VListItem from '@/components/valerie/VListItem.vue';
import VInput from '@/components/valerie/VInput.vue';
import VButton from '@/components/valerie/VButton.vue';
import VFormField from '@/components/valerie/VFormField.vue';

const props = defineProps({
    modelValue: {
        type: Boolean,
        required: true,
    },
    isAdding: {
        type: Boolean,
        default: false,
    },
});

const emit = defineEmits(['update:modelValue', 'add-items']);

const { t } = useI18n();

const ocrLoading = ref(false);
const ocrItems = ref<{ name: string }[]>([]);
const ocrError = ref<string | null>(null);
const ocrFileInputRef = ref<InstanceType<typeof VInput> | null>(null);

const handleOcrFileUpload = (event: Event) => {
    const target = event.target as HTMLInputElement;
    if (target.files && target.files.length > 0) {
        handleOcrUpload(target.files[0]);
    }
};

const handleOcrUpload = async (file: File) => {
    if (!file) return;
    ocrLoading.value = true;
    ocrError.value = null;
    ocrItems.value = [];
    try {
        const formData = new FormData();
        formData.append('image_file', file);
        const response = await apiClient.post(API_ENDPOINTS.OCR.PROCESS, formData, {
            headers: { 'Content-Type': 'multipart/form-data' },
        });
        ocrItems.value = response.data.extracted_items
            .map((nameStr: string) => ({ name: nameStr.trim() }))
            .filter((item: { name: string }) => item.name);
        if (ocrItems.value.length === 0) {
            ocrError.value = t('listDetailPage.errors.ocrNoItems');
        }
    } catch (err) {
        ocrError.value = getApiErrorMessage(err, 'listDetailPage.errors.ocrFailed', t);
    } finally {
        ocrLoading.value = false;
        // Reset file input
        if (ocrFileInputRef.value?.$el) {
            const input = ocrFileInputRef.value.$el.querySelector ? ocrFileInputRef.value.$el.querySelector('input') : ocrFileInputRef.value.$el;
            if (input) input.value = '';
        }
    }
};

const confirmAddItems = () => {
    emit('add-items', ocrItems.value);
};

watch(() => props.modelValue, (newVal) => {
    if (newVal) {
        ocrItems.value = [];
        ocrError.value = null;
    }
});
</script>
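The `extracted_items` post-processing inside `handleOcrUpload` (trim each OCR line, drop blanks) can be isolated as a pure helper. The sketch below is hypothetical — `normalizeOcrItems` does not exist in the component — but it mirrors the map/filter chain above and makes that step unit-testable:

```typescript
// Hypothetical standalone version of the extracted_items post-processing
// in handleOcrUpload: trim each OCR line, then drop empty results.
function normalizeOcrItems(extracted: string[]): { name: string }[] {
    return extracted
        .map((nameStr) => ({ name: nameStr.trim() }))
        .filter((item) => item.name.length > 0);
}

console.log(normalizeOcrItems(['  Milk ', '', ' Eggs', '   ']));
// → [ { name: 'Milk' }, { name: 'Eggs' } ]
```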
73	fe/src/components/list-detail/SettleShareModal.vue	Normal file
@@ -0,0 +1,73 @@
<template>
    <VModal :model-value="modelValue" :title="$t('listDetailPage.modals.settleShare.title')"
        @update:modelValue="$emit('update:modelValue', false)" size="md">
        <template #default>
            <div v-if="isLoading" class="text-center">
                <VSpinner :label="$t('listDetailPage.loading.settlement')" />
            </div>
            <div v-else>
                <p>
                    {{ $t('listDetailPage.modals.settleShare.settleAmountFor', { userName: userName }) }}
                </p>
                <VFormField :label="$t('listDetailPage.modals.settleShare.amountLabel')"
                    :error-message="error || undefined">
                    <VInput type="number" :model-value="amount" @update:modelValue="$emit('update:amount', $event)"
                        required />
                </VFormField>
            </div>
        </template>
        <template #footer>
            <VButton variant="neutral" @click="$emit('update:modelValue', false)">
                {{ $t('listDetailPage.modals.settleShare.cancelButton') }}
            </VButton>
            <VButton variant="primary" @click="$emit('confirm')" :disabled="isLoading">
                {{ $t('listDetailPage.modals.settleShare.confirmButton') }}
            </VButton>
        </template>
    </VModal>
</template>

<script setup lang="ts">
import { computed, defineProps, defineEmits } from 'vue';
import type { PropType } from 'vue';
import { useI18n } from 'vue-i18n';
import type { ExpenseSplit } from '@/types/expense';
import VModal from '@/components/valerie/VModal.vue';
import VSpinner from '@/components/valerie/VSpinner.vue';
import VFormField from '@/components/valerie/VFormField.vue';
import VInput from '@/components/valerie/VInput.vue';
import VButton from '@/components/valerie/VButton.vue';

const props = defineProps({
    modelValue: {
        type: Boolean,
        required: true,
    },
    split: {
        type: Object as PropType<ExpenseSplit | null>,
        required: true,
    },
    amount: {
        type: String,
        required: true,
    },
    error: {
        type: String as PropType<string | null>,
        default: null,
    },
    isLoading: {
        type: Boolean,
        default: false,
    }
});

defineEmits(['update:modelValue', 'update:amount', 'confirm']);

const { t } = useI18n();

const userName = computed(() => {
    if (!props.split) return '';
    return props.split.user?.name || props.split.user?.email || `User ID: ${props.split.user_id}`;
});

</script>
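The `userName` computed above is a fallback chain: prefer the split user's name, then their email, then a generic "User ID" label. A hypothetical standalone sketch (the `displayName` helper and `SplitUser` shape are illustrative, not part of `@/types/expense`):

```typescript
// Minimal sketch of the userName fallback chain from the computed above.
// SplitUser approximates the user object on an ExpenseSplit; assumption only.
type SplitUser = { name?: string | null; email?: string | null };

function displayName(split: { user?: SplitUser | null; user_id: number } | null): string {
    if (!split) return '';
    // Same precedence as the component: name, then email, then generic label.
    return split.user?.name || split.user?.email || `User ID: ${split.user_id}`;
}

console.log(displayName({ user: { email: 'a@b.c' }, user_id: 7 })); // → 'a@b.c'
```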
Some files were not shown because too many files have changed in this diff.