Compare commits

...

7 Commits

Author SHA1 Message Date
weli 63bd9cd8ff fix: add department_id to ReqClazz and ReqClazzRepeat structs
Rust Backend CI / check (push) Failing after 13m19s
Completes the department_id field addition across all Clazz-related
structs and initializers after the schema migration added the column.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
2026-05-03 09:45:57 +08:00
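The change pattern in this commit — threading a new nullable column through every model and request struct — can be sketched std-only. The struct shapes are heavily simplified (the real `Clazz`/`ReqClazz` are diesel models with many more fields), and `to_req` stands in for the real conversion:

```rust
// Simplified sketch of the department_id threading from this commit; only the
// relevant fields are shown, and `to_req` stands in for the real conversion.
#[derive(Debug, Clone, PartialEq)]
pub struct Clazz {
    pub org_id: Option<String>,
    pub department_id: Option<String>, // new nullable column from the migration
}

#[derive(Debug, Clone, PartialEq)]
pub struct ReqClazz {
    pub org_id: Option<String>,
    pub department_id: Option<String>, // must be added here too, or the field is dropped
}

impl Clazz {
    pub fn to_req(&self) -> ReqClazz {
        ReqClazz {
            org_id: self.org_id.clone(),
            department_id: self.department_id.clone(),
        }
    }
}

fn main() {
    let c = Clazz { org_id: Some("o-1".into()), department_id: Some("d-1".into()) };
    // Forgetting the clone in to_req would silently drop the field.
    assert_eq!(c.to_req().department_id, Some("d-1".to_string()));
}
```

The diff below shows the same one-line addition repeated across every initializer for exactly this reason.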
weli fa14a5ca8c perf: add GIN indexes on clazz JSONB columns and date-range indexes
- GIN indexes on students/teachers (jsonb_path_ops) for fast user lookups
- Composite index on clazz (is_repeat, start_from, end_by) for date range queries
- Indexes on clazz_repeat (clazz_id, repeat_start, repeat_end) for joins

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
2026-05-03 09:39:36 +08:00
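The `jsonb_path_ops` GIN indexes serve containment (`@>`) lookups like the one in the student stats query further down this diff. A std-only sketch of how that containment pattern is built — the helper name is hypothetical; the real code inlines the pattern into a `format!` SQL string:

```rust
// Hypothetical helper illustrating the `@>` containment pattern that the
// idx_clazz_students_gin / idx_clazz_teachers_gin indexes accelerate.
// The JSON shape mirrors the `students` column layout used in this diff:
// {"val": {"users": {"vals": [{"user_id": "..."}]}}}
fn student_containment_pattern(user_id: &str) -> String {
    format!(r#"{{"val": {{"users": {{"vals": [{{"user_id": "{user_id}"}}]}}}}}}"#)
}

fn main() {
    // In SQL this becomes: c.students @> '<pattern>'::jsonb
    let pattern = student_containment_pattern("s-1");
    assert_eq!(pattern, r#"{"val": {"users": {"vals": [{"user_id": "s-1"}]}}}"#);
    println!("{pattern}");
}
```

With `jsonb_path_ops`, only `@>` containment queries of this shape can use the index, in exchange for a smaller index than the default `jsonb_ops` operator class.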
weli 7b05a91988 docs: vconsole is sudoer-only (SYS_CAN_SUDO tag)
Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
2026-05-02 10:49:38 +08:00
weli 5861e646f4 docs: add E2E debugging principles to CLAUDE.md
Document vconsole visibility rules (SYS_TESTER label / SYS_CAN_SUDO tag),
existing debug data injection patterns, and the norm for inserting new
debug data in frontend (console.warn + localStorage), backend (tracing),
and database fixtures.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
2026-05-02 10:38:12 +08:00
weli debff3a447 fix: use string_to_date in by_hty_id clazz queries
The by_hty_id variants incorrectly used string_to_datetime which expects
%Y-%m-%d %H:%M:%S format, but the API sends date-only %Y-%m-%d.
Switching to string_to_date fixes silent query failures.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
2026-05-02 10:22:24 +08:00
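The mismatch this commit fixes can be reproduced with a std-only sketch. The real `string_to_date` / `string_to_datetime` live in htycommons; these stand-ins only mimic the strictness that matters here — a `%Y-%m-%d %H:%M:%S` parser rejects date-only input:

```rust
// Std-only stand-ins for string_to_date / string_to_datetime, illustrating
// why a strict datetime parser rejects the API's date-only strings.
fn parse_date(s: &str) -> Result<(i32, u32, u32), String> {
    // strict %Y-%m-%d
    let parts: Vec<&str> = s.split('-').collect();
    if parts.len() != 3 {
        return Err(format!("not %Y-%m-%d: {s}"));
    }
    let y = parts[0].parse::<i32>().map_err(|e| e.to_string())?;
    let m = parts[1].parse::<u32>().map_err(|e| e.to_string())?;
    let d = parts[2].parse::<u32>().map_err(|e| e.to_string())?;
    Ok((y, m, d))
}

fn parse_datetime(s: &str) -> Result<((i32, u32, u32), &str), String> {
    // strict %Y-%m-%d %H:%M:%S: the space-separated time part is mandatory
    let (date, time) = s
        .split_once(' ')
        .ok_or_else(|| format!("missing %H:%M:%S part: {s}"))?;
    Ok((parse_date(date)?, time))
}

fn main() {
    // The API sends date-only values, so the datetime parser fails...
    assert!(parse_datetime("2026-05-02").is_err());
    // ...while the date parser (the fix) accepts them.
    assert_eq!(parse_date("2026-05-02"), Ok((2026, 5, 2)));
}
```

Because the handler swallowed the parse error into an empty result, the failure was silent rather than a visible 4xx — hence "silent query failures".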
weli be6734a23d docs: add e2e test workflow and prod smoke test to CLAUDE.md
Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
2026-05-01 21:10:09 +08:00
weli 5bf43143ea feat: add audit logging and lesson statistics for clazz
- Add clazz_audit_log table with diesel migration for CRUD audit trail
- Add audit log backend: model, queries, handler, route
- Add audit log viewer in clazz detail modal (操作记录)
- Add student_lesson_stats API (GET /api/v1/clazz/stats/my-lessons)
- Add teacher_detail_stats API (GET /api/v1/clazz/stats/teacher-detail)
- Register all new routes in lib.rs

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
2026-05-01 19:45:56 +08:00
18 changed files with 466 additions and 9 deletions
+43
@@ -21,6 +21,49 @@
---
## E2E Test Workflow (standard flow for new feature development)

To save CI resources and speed up iteration, new feature development follows a **verify locally first, then push to CI** flow:

1. **Local development & compile** — `cargo check` to confirm there are no compile errors
2. **Deploy to the test server (moicen)** — `git push` → SSH into moicen → `git pull` → `cargo build --release`
3. **Run e2e tests locally** — run the e2e suite locally with requests pointed at the moicen test server
4. **Push to CI once verified** — `git push origin master` → GitHub Actions automatically runs `cargo check` + the full e2e suite
5. **Deploy to production (alchemy)** — SSH into alchemy → `git pull` → `cargo build --release` → restart the service
6. **Production CI smoke test** — immediately after deploying, trigger the GitHub Actions prod smoke test workflow to verify the production API responds normally. If the smoke test fails, roll back immediately and investigate

**Why:** most problems surface in steps 2-3, avoiding repeated CI runs that waste queue time and GitHub Actions quota.

## E2E Debugging Principles

When an E2E problem cannot be diagnosed directly, debug data can be injected into the frontend and backend to assist.

### vConsole (mini-program frontend debug panel)

- **When it opens**: only for sudoer users carrying the `SYS_CAN_SUDO` tag → the vConsole panel shows automatically
- **Where it is implemented**: the `watch(() => store.current)` in `huike-front/src/App.vue`
- **How to use it**: with vConsole open, the Console tab shows `console.warn` / `console.log` output and the Storage tab shows `localStorage` data

### Existing debug data injection patterns

The following debug injection patterns already exist in the project and can be reused directly:

| Pattern | Injection site | How to view |
|------|----------|----------|
| `window.__cp_debug` | `pick.vue` / `add.vue` (course-package flows) | type `__cp_debug` in the vConsole Console tab |
| `localStorage.setItem("OrgSwitchDebug", ...)` | `store/org.ts` | vConsole Storage tab |
| `localStorage.setItem("ClazzPayloadDebug", ...)` | `store/clazz.ts` | vConsole Storage tab |
| `localStorage.setItem("CourseSectionPayloadDebug", ...)` | `store/qumu-section.ts` | vConsole Storage tab |
| `console.warn("[OrgSwitchDebug]", ...)` | `store/org.ts` | vConsole Console tab |

### Conventions for inserting new debug data

**Frontend**: following the existing patterns, inject with `console.warn("[TagName]", data)` plus `window.__tagName = data` or `localStorage.setItem("TagName", JSON.stringify(data))`. vConsole captures console output automatically.

**Backend (Rust)**: print key intermediate data with `tracing::warn!("[TagName] {:?}", data)` or `tracing::info!(...)`. The logs appear in the service's nohup output (`/tmp/htykc.log` / `/tmp/htyuc.log`) and in the CI teardown log step.

**Database**: add SQL fixture files under `huike-unit/ci/fixtures/` to insert specific test data, then add a step in `e2e.yml` to run them.
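The backend tagging norm above can be sketched std-only. The real code uses `tracing::warn!` directly; `tagged_debug` is a hypothetical helper that only shows the `"[TagName] {:?}"` message convention:

```rust
// Hypothetical std-only helper mirroring the "[TagName] {:?}" convention the
// backend uses with tracing::warn! / tracing::info!.
use std::fmt::Debug;

fn tagged_debug(tag: &str, data: &impl Debug) -> String {
    format!("[{tag}] {data:?}")
}

fn main() {
    let payload = ("clazz_id", "c-001", "UPDATE");
    // With tracing this would be: tracing::warn!("[ClazzAuditDebug] {:?}", payload);
    let line = tagged_debug("ClazzAuditDebug", &payload);
    assert!(line.starts_with("[ClazzAuditDebug] "));
    println!("{line}");
}
```

Keeping the bracketed tag first makes the lines greppable in `/tmp/htykc.log` and filterable in the vConsole Console tab.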
## Deployment Principles

**Never deploy code via scp/rsync.** All deployments must go through the GitHub push → server git pull flow.
+5 -2
@@ -30,8 +30,8 @@ use crate::ws_course_package::{
};
use crate::ws_xiaoke::{batch_save_clazz_attendance, find_clazz_attendance_by_clazz_id};
use crate::ws_xiaoke::{
approve_clazz_leave, create_clazz_leave, list_clazz_leave, supervisor_teacher_stats,
teacher_hour_stats,
approve_clazz_leave, create_clazz_leave, list_clazz_leave, student_lesson_stats,
supervisor_teacher_stats, teacher_detail_stats, teacher_hour_stats,
};
#[debug_handler]
@@ -102,9 +102,12 @@ pub fn clazz_router(db_url: &str) -> Router {
)
.route("/api/v1/clazz/stats/my-hours", get(teacher_hour_stats))
.route("/api/v1/clazz/stats/org-teachers", get(supervisor_teacher_stats))
.route("/api/v1/clazz/stats/my-lessons", get(student_lesson_stats))
.route("/api/v1/clazz/stats/teacher-detail", get(teacher_detail_stats))
.route("/api/v1/clazz/leave/create", post(create_clazz_leave))
.route("/api/v1/clazz/leave/approve", post(approve_clazz_leave))
.route("/api/v1/clazz/leave/list", get(list_clazz_leave))
.route("/api/v1/clazz/audit-log/list", get(list_clazz_audit_log))
// 课包(course_package)
.route("/api/v1/clazz/course-package/create", post(create_course_package))
.route("/api/v1/clazz/course-package/update", post(update_course_package))
+94 -6
@@ -14,7 +14,7 @@ use htycommons::web::{
HtySudoerTokenHeader,
};
use htykc_models::models::{
Clazz, ClazzRepeat, ReqClazz, ReqClazzRepeat, ReqClazzWithRepeat,
Clazz, ClazzAuditLog, ClazzRepeat, NewClazzAuditLog, ReqClazz, ReqClazzRepeat, ReqClazzWithRepeat,
};
use std::collections::HashMap;
use std::ops::DerefMut;
@@ -27,6 +27,37 @@ fn current_org_id_from_auth(token: &AuthorizationHeader) -> Option<String> {
.and_then(|decoded| decoded.current_org_id)
}
fn hty_id_from_auth(token: &AuthorizationHeader) -> Option<String> {
jwt_decode_token(&(*token).clone())
.ok()
.and_then(|decoded| decoded.hty_id)
}
fn write_clazz_audit_log(
db_pool: &Arc<DbState>,
clazz_id: &str,
action: &str,
operator_hty_id: &str,
changes: Option<serde_json::Value>,
org_id: Option<String>,
) {
if let Ok(conn) = fetch_db_conn(db_pool) {
let _ = ClazzAuditLog::insert(
&NewClazzAuditLog {
id: uuid(),
clazz_id: clazz_id.to_string(),
action: action.to_string(),
operator_hty_id: operator_hty_id.to_string(),
operator_name: None,
changes,
created_at: current_local_datetime(),
org_id,
},
extract_conn(conn).deref_mut(),
);
}
}
pub async fn find_all_non_repeatable_within_date_range(
sudoer: HtySudoerTokenHeader,
host: HtyHostHeader,
@@ -147,8 +178,8 @@ fn raw_find_all_non_repeatable_within_date_range_by_hty_id(
params: &HashMap<String, String>,
) -> anyhow::Result<Vec<ReqClazz>> {
let start_from =
string_to_datetime(&get_some_from_query_params::<String>("start_from", &params))?;
let end_by = string_to_datetime(&get_some_from_query_params::<String>("end_by", &params))?;
string_to_date(&get_some_from_query_params::<String>("start_from", &params))?;
let end_by = string_to_date(&get_some_from_query_params::<String>("end_by", &params))?;
debug!(
"raw_find_all_non_repeatable_within_date_range_by_hty_id -> start_from: {:?}",
@@ -345,8 +376,8 @@ fn raw_find_all_repeatable_within_date_range_by_hty_id(
params: &HashMap<String, String>,
) -> anyhow::Result<Vec<ReqClazzWithRepeat>> {
let start_from =
string_to_datetime(&get_some_from_query_params::<String>("start_from", &params))?;
let end_by = string_to_datetime(&get_some_from_query_params::<String>("end_by", &params))?;
string_to_date(&get_some_from_query_params::<String>("start_from", &params))?;
let end_by = string_to_date(&get_some_from_query_params::<String>("end_by", &params))?;
debug!(
"raw_find_all_repeatable_within_date_range_by_hty_id -> start_from: {:?}",
@@ -666,6 +697,23 @@ async fn raw_update_clazz(
extract_conn(fetch_db_conn(&db_pool)?).deref_mut(),
)?;
// Audit log
let operator_id = hty_id_from_auth(&token).unwrap_or_default();
let action = if in_kecheng.is_delete == Some(true) { "DELETE" } else { "UPDATE" };
write_clazz_audit_log(
&db_pool,
&id_kecheng,
action,
&operator_id,
Some(serde_json::json!({
"is_delete": in_kecheng.is_delete,
"clazz_name": in_kecheng.clazz_name,
"start_from": in_kecheng.start_from,
"end_by": in_kecheng.end_by,
})),
current_org_id,
);
Ok(())
}
@@ -814,6 +862,7 @@ async fn raw_create_clazz_with_repeat(
is_notified: in_kecheng.is_notified.clone(),
completed_at: None,
org_id: in_kecheng.org_id.clone(),
department_id: in_kecheng.department_id.clone(),
};
if to_create_kecheng.org_id.is_none() {
to_create_kecheng.org_id = current_org_id.clone();
@@ -839,6 +888,7 @@ async fn raw_create_clazz_with_repeat(
repeat_status: kecheng_repeat_copy.repeat_status.clone(),
latest_clazz_created_at: kecheng_repeat_copy.latest_clazz_created_at.clone(),
org_id: kecheng_repeat_copy.org_id.clone(),
department_id: kecheng_repeat_copy.department_id.clone(),
})
}
@@ -848,11 +898,24 @@ async fn raw_create_clazz_with_repeat(
(to_create_kecheng, to_create_kecheng_repeat),
);
let out_result = raw_create_clazz_with_repeat_tx(params, db_pool);
let out_result = raw_create_clazz_with_repeat_tx(params, db_pool.clone());
match out_result {
Ok(out) => {
debug!("created_kecheng: {:?}", out);
write_clazz_audit_log(
&db_pool,
&new_clazz_id,
"CREATE",
&id_user,
Some(serde_json::json!({
"clazz_name": in_kecheng.clazz_name,
"start_from": in_kecheng.start_from,
"end_by": in_kecheng.end_by,
"is_repeat": in_kecheng.is_repeat,
})),
current_org_id.clone(),
);
Ok(out)
}
Err(e) => Err(anyhow!(HtyErr {
@@ -862,6 +925,30 @@ async fn raw_create_clazz_with_repeat(
}
}
pub async fn list_clazz_audit_log(
_sudoer: HtySudoerTokenHeader,
_host: HtyHostHeader,
_auth: AuthorizationHeader,
State(db_pool): State<Arc<DbState>>,
Query(params): Query<HashMap<String, String>>,
) -> Json<HtyResponse<Vec<ClazzAuditLog>>> {
let result = (|| -> anyhow::Result<Vec<ClazzAuditLog>> {
let clazz_id = get_some_from_query_params::<String>("clazz_id", &params)
.ok_or_else(|| anyhow!("clazz_id is required"))?;
ClazzAuditLog::list_by_clazz_id(
&clazz_id,
extract_conn(fetch_db_conn(&db_pool)?).deref_mut(),
)
})();
match result {
Ok(ok) => wrap_json_ok_resp(ok),
Err(e) => {
error!("list_clazz_audit_log -> failed, e: {}", e);
wrap_json_anyhow_err(e)
}
}
}
#[cfg(test)]
mod tests {
use htycommons::common::current_local_datetime;
@@ -881,6 +968,7 @@ mod tests {
tags: None,
current_org_id: current_org_id.map(|value| value.to_string()),
current_org_role_keys: Some(vec!["TEACHER".to_string()]),
current_department_id: None,
};
jwt_encode_token(token).expect("encode test token")
}
+1
@@ -72,6 +72,7 @@ async fn raw_create_clazz_repeat(
repeat_status: in_kecheng_repeat.repeat_status.clone(),
latest_clazz_created_at: in_kecheng_repeat.latest_clazz_created_at.clone(),
org_id: in_kecheng_repeat.org_id.clone(),
department_id: in_kecheng_repeat.department_id.clone(),
};
let created_kecheng_repeat_result = ClazzRepeat::create(
+61 -1
@@ -10,7 +10,8 @@ use htycommons::web::{
HtySudoerTokenHeader,
};
use htykc_models::models::{
Clazz, ClazzAttendance, ClazzLeaveRequest, ReqBatchClazzAttendance, ReqClazzLeaveRequest, TeacherHourStatsRow,
Clazz, ClazzAttendance, ClazzLeaveRequest, ReqBatchClazzAttendance, ReqClazzLeaveRequest,
StudentLessonRow, TeacherDetailStatsRow, TeacherHourStatsRow,
};
use htycommons::uuid;
use std::collections::HashMap;
@@ -175,6 +176,64 @@ pub async fn supervisor_teacher_stats(
}
}
pub async fn student_lesson_stats(
_sudoer: HtySudoerTokenHeader,
_host: HtyHostHeader,
auth: AuthorizationHeader,
State(db_pool): State<Arc<DbState>>,
Query(params): Query<HashMap<String, String>>,
) -> Json<HtyResponse<Vec<StudentLessonRow>>> {
let result = (|| -> anyhow::Result<Vec<StudentLessonRow>> {
let token = jwt_decode_token(&(*auth).clone())?;
let student_id = token.hty_id.ok_or_else(|| anyhow!("hty_id is required"))?;
let org_id = required_org_id_from_auth(&auth)?;
let start_date = get_some_from_query_params::<String>("start_date", &params)
.unwrap_or_else(|| "1970-01-01 00:00:00".to_string());
let end_date = get_some_from_query_params::<String>("end_date", &params)
.unwrap_or_else(|| "2999-12-31 23:59:59".to_string());
Clazz::student_lesson_stats_by_org_and_date_range(
&org_id,
&start_date,
&end_date,
&student_id,
extract_conn(fetch_db_conn(&db_pool)?).deref_mut(),
)
})();
match result {
Ok(ok) => wrap_json_ok_resp(ok),
Err(e) => wrap_json_anyhow_err(e),
}
}
pub async fn teacher_detail_stats(
_sudoer: HtySudoerTokenHeader,
_host: HtyHostHeader,
auth: AuthorizationHeader,
State(db_pool): State<Arc<DbState>>,
Query(params): Query<HashMap<String, String>>,
) -> Json<HtyResponse<Vec<TeacherDetailStatsRow>>> {
let result = (|| -> anyhow::Result<Vec<TeacherDetailStatsRow>> {
let token = jwt_decode_token(&(*auth).clone())?;
let teacher_id = token.hty_id.ok_or_else(|| anyhow!("hty_id is required"))?;
let org_id = required_org_id_from_auth(&auth)?;
let start_date = get_some_from_query_params::<String>("start_date", &params)
.unwrap_or_else(|| "1970-01-01 00:00:00".to_string());
let end_date = get_some_from_query_params::<String>("end_date", &params)
.unwrap_or_else(|| "2999-12-31 23:59:59".to_string());
Clazz::teacher_detail_stats_by_org_and_date_range(
&org_id,
&start_date,
&end_date,
&teacher_id,
extract_conn(fetch_db_conn(&db_pool)?).deref_mut(),
)
})();
match result {
Ok(ok) => wrap_json_ok_resp(ok),
Err(e) => wrap_json_anyhow_err(e),
}
}
pub async fn create_clazz_leave(
_sudoer: HtySudoerTokenHeader,
_host: HtyHostHeader,
@@ -278,6 +337,7 @@ mod tests {
tags: None,
current_org_id: current_org_id.map(|value| value.to_string()),
current_org_role_keys: Some(vec!["TEACHER".to_string()]),
current_department_id: None,
};
AuthorizationHeader(jwt_encode_token(token).expect("encode test token"))
}
+1
@@ -33,6 +33,7 @@ fn build_test_token(with_org_context: bool) -> String {
None
},
current_org_role_keys: Some(vec!["TEACHER".to_string()]),
current_department_id: None,
};
jwt_encode_token(token).expect("encode test token")
}
@@ -0,0 +1 @@
DROP TABLE IF EXISTS clazz_audit_log;
@@ -0,0 +1,13 @@
CREATE TABLE clazz_audit_log (
id VARCHAR PRIMARY KEY,
clazz_id VARCHAR NOT NULL,
action VARCHAR NOT NULL,
operator_hty_id VARCHAR NOT NULL,
operator_name VARCHAR,
changes JSONB,
created_at TIMESTAMP NOT NULL DEFAULT NOW(),
org_id VARCHAR
);
CREATE INDEX idx_clazz_audit_log_clazz_id ON clazz_audit_log (clazz_id);
CREATE INDEX idx_clazz_audit_log_created_at ON clazz_audit_log (created_at);
@@ -0,0 +1,11 @@
DROP INDEX IF EXISTS idx_clazz_department_id;
DROP INDEX IF EXISTS idx_clazz_attendance_department_id;
DROP INDEX IF EXISTS idx_course_hour_package_department_id;
DROP INDEX IF EXISTS idx_hour_transaction_department_id;
DROP INDEX IF EXISTS idx_clazz_repeat_department_id;
ALTER TABLE clazz_repeat DROP COLUMN IF EXISTS department_id;
ALTER TABLE hour_transaction DROP COLUMN IF EXISTS department_id;
ALTER TABLE course_hour_package DROP COLUMN IF EXISTS department_id;
ALTER TABLE clazz_attendance DROP COLUMN IF EXISTS department_id;
ALTER TABLE clazz DROP COLUMN IF EXISTS department_id;
@@ -0,0 +1,12 @@
-- Add department_id to KC teaching tables (nullable first, backfill, then not-null in future)
ALTER TABLE clazz ADD COLUMN department_id VARCHAR;
ALTER TABLE clazz_attendance ADD COLUMN department_id VARCHAR;
ALTER TABLE course_hour_package ADD COLUMN department_id VARCHAR;
ALTER TABLE hour_transaction ADD COLUMN department_id VARCHAR;
ALTER TABLE clazz_repeat ADD COLUMN department_id VARCHAR;
CREATE INDEX idx_clazz_department_id ON clazz (department_id);
CREATE INDEX idx_clazz_attendance_department_id ON clazz_attendance (department_id);
CREATE INDEX idx_course_hour_package_department_id ON course_hour_package (department_id);
CREATE INDEX idx_hour_transaction_department_id ON hour_transaction (department_id);
CREATE INDEX idx_clazz_repeat_department_id ON clazz_repeat (department_id);
@@ -0,0 +1,5 @@
DROP INDEX IF EXISTS idx_clazz_students_gin;
DROP INDEX IF EXISTS idx_clazz_teachers_gin;
DROP INDEX IF EXISTS idx_clazz_date_active;
DROP INDEX IF EXISTS idx_clazz_repeat_clazz_id;
DROP INDEX IF EXISTS idx_clazz_repeat_date_range;
@@ -0,0 +1,10 @@
-- GIN indexes for JSONB path queries on students/teachers
CREATE INDEX IF NOT EXISTS idx_clazz_students_gin ON clazz USING gin (students jsonb_path_ops);
CREATE INDEX IF NOT EXISTS idx_clazz_teachers_gin ON clazz USING gin (teachers jsonb_path_ops);
-- Composite index for date range + is_repeat filter
CREATE INDEX IF NOT EXISTS idx_clazz_date_active ON clazz (is_repeat, start_from, end_by);
-- Indexes on clazz_repeat for join queries
CREATE INDEX IF NOT EXISTS idx_clazz_repeat_clazz_id ON clazz_repeat (clazz_id);
CREATE INDEX IF NOT EXISTS idx_clazz_repeat_date_range ON clazz_repeat (repeat_start, repeat_end);
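The composite `(is_repeat, start_from, end_by)` index above lines up with the WHERE shape of the date-range queries in this diff. A std-only sketch of that predicate — the helper is hypothetical; the real queries interpolate the filter via `format!`:

```rust
// Hypothetical helper showing the filter shape served by idx_clazz_date_active:
// a leading is_repeat equality plus the start_from / end_by range bounds.
fn date_active_filter(is_repeat: bool, start: &str, end: &str) -> String {
    format!(
        "c.is_repeat = {is_repeat} and c.start_from >= '{start}' and c.end_by <= '{end}' \
         and (c.is_delete is null or c.is_delete = false)"
    )
}

fn main() {
    let clause = date_active_filter(false, "2026-05-01", "2026-05-31");
    assert!(clause.contains("c.start_from >= '2026-05-01'"));
    println!("{clause}");
}
```

Putting the equality column first lets the planner use the index both for `is_repeat`-filtered scans and, via the leading-column prefix, for plain date-range scans.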
+181
@@ -52,6 +52,7 @@ pub struct Clazz {
pub is_notified: Option<bool>,
pub completed_at: Option<NaiveDateTime>,
pub org_id: Option<String>,
pub department_id: Option<String>,
}
// https://weinan.io/2024/02/23/rust-diesel.html
@@ -152,6 +153,7 @@ impl ReqClazzWithRepeat {
is_notified: None,
completed_at: None,
org_id: None,
department_id: None,
}
};
@@ -169,6 +171,7 @@ impl ReqClazzWithRepeat {
repeat_status: None,
latest_clazz_created_at: None,
org_id: None,
department_id: None,
}
};
@@ -232,6 +235,7 @@ impl Clazz {
is_notified: self.is_notified.clone(),
completed_at: self.completed_at.clone(),
org_id: self.org_id.clone(),
department_id: self.department_id.clone(),
};
req_res
}
@@ -453,6 +457,7 @@ pub struct ClazzRepeat {
pub repeat_status: Option<String>,
pub latest_clazz_created_at: Option<NaiveDateTime>,
pub org_id: Option<String>,
pub department_id: Option<String>,
}
impl ClazzRepeat {
@@ -485,6 +490,7 @@ impl ClazzRepeat {
repeat_status: self.repeat_status.clone(),
latest_clazz_created_at: self.latest_clazz_created_at.clone(),
org_id: self.org_id.clone(),
department_id: self.department_id.clone(),
};
req_repeat
}
@@ -619,6 +625,7 @@ pub struct ReqClazz {
pub is_notified: Option<bool>,
pub completed_at: Option<NaiveDateTime>,
pub org_id: Option<String>,
pub department_id: Option<String>,
}
impl ReqClazz {
@@ -652,6 +659,7 @@ impl ReqClazz {
is_notified: None,
completed_at: None,
org_id: None,
department_id: None,
}
};
@@ -689,6 +697,7 @@ impl ReqClazz {
is_notified: the_kecheng.is_notified,
completed_at: the_kecheng.completed_at,
org_id: the_kecheng.org_id,
department_id: the_kecheng.department_id.clone(),
}
}
}
@@ -705,6 +714,7 @@ pub struct ReqClazzRepeat {
pub repeat_status: Option<String>,
pub latest_clazz_created_at: Option<NaiveDateTime>,
pub org_id: Option<String>,
pub department_id: Option<String>,
}
#[derive(Serialize, Deserialize, Debug, Clone)]
@@ -1167,6 +1177,7 @@ pub struct ClazzAttendance {
pub created_by: Option<String>,
pub is_delete: Option<bool>,
pub org_id: Option<String>,
pub department_id: Option<String>,
}
#[derive(Insertable, Clone, Debug)]
@@ -1184,6 +1195,7 @@ pub struct NewClazzAttendance {
pub created_by: Option<String>,
pub is_delete: Option<bool>,
pub org_id: Option<String>,
pub department_id: Option<String>,
}
#[derive(Serialize, Deserialize, Debug, Clone)]
@@ -1255,6 +1267,124 @@ pub struct TeacherHourStatsRow {
pub total_hours: f64,
}
#[derive(QueryableByName, Serialize, Deserialize, Debug, Clone)]
pub struct StudentLessonRow {
#[diesel(sql_type = diesel::sql_types::Varchar)]
pub clazz_id: String,
#[diesel(sql_type = diesel::sql_types::Varchar)]
pub clazz_name: String,
#[diesel(sql_type = diesel::sql_types::Timestamp)]
pub start_from: NaiveDateTime,
#[diesel(sql_type = diesel::sql_types::Timestamp)]
pub end_by: NaiveDateTime,
#[diesel(sql_type = diesel::sql_types::Nullable<diesel::sql_types::Varchar>)]
pub teacher_names: Option<String>,
#[diesel(sql_type = diesel::sql_types::Nullable<diesel::sql_types::Varchar>)]
pub attendance_status: Option<String>,
#[diesel(sql_type = diesel::sql_types::Nullable<diesel::sql_types::Double>)]
pub deducted_hours: Option<f64>,
#[diesel(sql_type = diesel::sql_types::Nullable<diesel::sql_types::Varchar>)]
pub leave_status: Option<String>,
}
#[derive(QueryableByName, Serialize, Deserialize, Debug, Clone)]
pub struct TeacherDetailStatsRow {
#[diesel(sql_type = diesel::sql_types::Varchar)]
pub student_id: String,
#[diesel(sql_type = diesel::sql_types::Nullable<diesel::sql_types::Varchar>)]
pub student_name: Option<String>,
#[diesel(sql_type = diesel::sql_types::BigInt)]
pub total_classes: i64,
#[diesel(sql_type = diesel::sql_types::BigInt)]
pub attended: i64,
#[diesel(sql_type = diesel::sql_types::BigInt)]
pub absent: i64,
#[diesel(sql_type = diesel::sql_types::Double)]
pub total_hours: f64,
#[diesel(sql_type = diesel::sql_types::BigInt)]
pub leave_count: i64,
}
impl Clazz {
pub fn student_lesson_stats_by_org_and_date_range(
org: &String,
start_date: &String,
end_date: &String,
student_id: &String,
conn: &mut PgConnection,
) -> anyhow::Result<Vec<StudentLessonRow>> {
let query_text = format!(
"select c.id as clazz_id,
c.clazz_name,
c.start_from,
c.end_by,
(select string_agg(elem->>'real_name', ', ')
from jsonb_array_elements(c.teachers#>'{{val,users,vals}}') elem) as teacher_names,
a.status as attendance_status,
a.deducted_hours,
lr.request_status as leave_status
from clazz c
left join clazz_attendance a on c.id = a.clazz_id and a.student_id = '{sid}' and (a.is_delete is null or a.is_delete = false)
left join clazz_leave_request lr on c.id = lr.clazz_id and lr.student_id = '{sid}'
where c.org_id = '{org}'
and c.start_from >= '{start}'
and c.end_by <= '{end}'
and (c.is_delete is null or c.is_delete = false)
and c.students @> '{{\"val\": {{\"users\": {{\"vals\": [{{\"user_id\": \"{sid}\"}}]}}}}}}'::jsonb
order by c.start_from desc",
sid = student_id,
org = org,
start = start_date,
end = end_date,
);
sql_query(query_text)
.load::<StudentLessonRow>(conn)
.map_err(|e| anyhow!(HtyErr {
code: HtyErrCode::DbErr,
reason: Some(e.to_string()),
}))
}
pub fn teacher_detail_stats_by_org_and_date_range(
org: &String,
start_date: &String,
end_date: &String,
teacher_id: &String,
conn: &mut PgConnection,
) -> anyhow::Result<Vec<TeacherDetailStatsRow>> {
let query_text = format!(
"select s_el->>'user_id' as student_id,
s_el->>'real_name' as student_name,
count(distinct c.id)::bigint as total_classes,
count(distinct a.id) filter (where a.status = 'NORMAL')::bigint as attended,
count(distinct a.id) filter (where a.status = 'ABSENT')::bigint as absent,
coalesce(sum(a.deducted_hours), 0)::double precision as total_hours,
count(distinct lr.id)::bigint as leave_count
from clazz c
cross join lateral jsonb_array_elements(c.students#>'{{val,users,vals}}') s_el
left join clazz_attendance a on c.id = a.clazz_id and a.student_id = s_el->>'user_id' and (a.is_delete is null or a.is_delete = false)
left join clazz_leave_request lr on c.id = lr.clazz_id and lr.student_id = s_el->>'user_id'
where c.org_id = '{org}'
and c.start_from >= '{start}'
and c.end_by <= '{end}'
and (c.is_delete is null or c.is_delete = false)
and c.created_by = '{tid}'
group by s_el->>'user_id', s_el->>'real_name'
order by student_name",
org = org,
start = start_date,
end = end_date,
tid = teacher_id,
);
sql_query(query_text)
.load::<TeacherDetailStatsRow>(conn)
.map_err(|e| anyhow!(HtyErr {
code: HtyErrCode::DbErr,
reason: Some(e.to_string()),
}))
}
}
impl ClazzLeaveRequest {
pub fn create(
payload: &ClazzLeaveRequest,
@@ -1406,6 +1536,7 @@ impl ClazzAttendance {
created_by: operator_id.clone(),
is_delete: Some(false),
org_id: None,
department_id: None,
})
.collect();
@@ -1434,3 +1565,53 @@ impl ClazzAttendance {
})
}
}
#[derive(Serialize, Deserialize, Debug, Clone, Queryable, Insertable, PartialEq)]
#[diesel(table_name = clazz_audit_log)]
pub struct ClazzAuditLog {
pub id: String,
pub clazz_id: String,
pub action: String,
pub operator_hty_id: String,
pub operator_name: Option<String>,
pub changes: Option<serde_json::Value>,
pub created_at: NaiveDateTime,
pub org_id: Option<String>,
}
#[derive(Insertable)]
#[diesel(table_name = clazz_audit_log)]
pub struct NewClazzAuditLog {
pub id: String,
pub clazz_id: String,
pub action: String,
pub operator_hty_id: String,
pub operator_name: Option<String>,
pub changes: Option<serde_json::Value>,
pub created_at: NaiveDateTime,
pub org_id: Option<String>,
}
impl ClazzAuditLog {
pub fn insert(
log: &NewClazzAuditLog,
conn: &mut PgConnection,
) -> anyhow::Result<()> {
use crate::schema::clazz_audit_log::dsl::*;
insert_into(clazz_audit_log)
.values(log)
.execute(conn)?;
Ok(())
}
pub fn list_by_clazz_id(
clazz_id_in: &String,
conn: &mut PgConnection,
) -> anyhow::Result<Vec<ClazzAuditLog>> {
use crate::schema::clazz_audit_log::dsl::*;
Ok(clazz_audit_log
.filter(clazz_id.eq(clazz_id_in))
.order(created_at.desc())
.load::<ClazzAuditLog>(conn)?)
}
}
+19
@@ -24,6 +24,7 @@ diesel::table! {
is_notified -> Nullable<Bool>,
completed_at -> Nullable<Timestamp>,
org_id -> Nullable<Varchar>,
department_id -> Nullable<Varchar>,
}
}
@@ -41,6 +42,20 @@ diesel::table! {
created_by -> Nullable<Varchar>,
is_delete -> Nullable<Bool>,
org_id -> Nullable<Varchar>,
department_id -> Nullable<Varchar>,
}
}
diesel::table! {
clazz_audit_log (id) {
id -> Varchar,
clazz_id -> Varchar,
action -> Varchar,
operator_hty_id -> Varchar,
operator_name -> Nullable<Varchar>,
changes -> Nullable<Jsonb>,
created_at -> Timestamp,
org_id -> Nullable<Varchar>,
}
}
@@ -74,6 +89,7 @@ diesel::table! {
repeat_status -> Nullable<Varchar>,
latest_clazz_created_at -> Nullable<Timestamp>,
org_id -> Nullable<Varchar>,
department_id -> Nullable<Varchar>,
}
}
@@ -92,6 +108,7 @@ diesel::table! {
updated_at -> Nullable<Timestamp>,
is_delete -> Nullable<Bool>,
org_id -> Nullable<Varchar>,
department_id -> Nullable<Varchar>,
}
}
@@ -142,6 +159,7 @@ diesel::table! {
remark -> Nullable<Varchar>,
created_at -> Timestamp,
org_id -> Nullable<Varchar>,
department_id -> Nullable<Varchar>,
}
}
@@ -155,6 +173,7 @@ diesel::joinable!(hour_transaction -> course_hour_package (package_id));
diesel::allow_tables_to_appear_in_same_query!(
clazz,
clazz_attendance,
clazz_audit_log,
clazz_leave_request,
clazz_repeat,
course_hour_package,
+1
@@ -52,6 +52,7 @@ fn mint_sudo_jwt() -> String {
tags: None,
current_org_id: None,
current_org_role_keys: None,
current_department_id: None,
};
jwt_encode_token(inner).expect("jwt_encode_token")
}
+1
@@ -4358,6 +4358,7 @@ mod tests {
tags: None,
current_org_id: current_org_id.map(|value| value.to_string()),
current_org_role_keys: Some(vec!["TEACHER".to_string()]),
current_department_id: None,
};
jwt_encode_token(token).expect("encode test token")
}
@@ -0,0 +1,3 @@
DROP INDEX IF EXISTS idx_teacher_student_department_id;
ALTER TABLE teacher_student DROP COLUMN IF EXISTS department_id;
@@ -0,0 +1,4 @@
-- Add department_id to teacher_student (nullable first, backfill, then not-null in future)
ALTER TABLE teacher_student ADD COLUMN department_id VARCHAR;
CREATE INDEX idx_teacher_student_department_id ON teacher_student (department_id);