Merge pull request 'release: v1.1.0 - sync status menu + batch refactoring + CI/CD' (#15) from develop into main
All checks were successful
Build and Deploy SNP Sync Batch / build-and-deploy (push) Successful in 29s

This commit is contained in:
HYOJIN 2026-03-25 11:24:25 +09:00
Commit 3cfa1fe925
106 changed files with 3212 additions and 3488 deletions


@ -26,7 +26,9 @@
"Bash(git show *)",
"Bash(git tag *)",
"Bash(curl -s *)",
"Bash(sdk *)"
"Bash(sdk *)",
"Bash(chmod +x *)",
"Bash(bash .claude/scripts/*)"
],
"deny": [
"Bash(git push --force*)",
@ -46,5 +48,42 @@
"Read(./**/application-local.yml)",
"Read(./**/application-local.properties)"
]
},
"hooks": {
"SessionStart": [
{
"matcher": "compact",
"hooks": [
{
"type": "command",
"command": "bash .claude/scripts/on-post-compact.sh",
"timeout": 10
}
]
}
],
"PreCompact": [
{
"hooks": [
{
"type": "command",
"command": "bash .claude/scripts/on-pre-compact.sh",
"timeout": 30
}
]
}
],
"PostToolUse": [
{
"matcher": "Bash",
"hooks": [
{
"type": "command",
"command": "bash .claude/scripts/on-commit.sh",
"timeout": 15
}
]
}
]
}
}


@ -0,0 +1,6 @@
{
"applied_global_version": "1.6.1",
"applied_date": "2026-03-23",
"project_type": "java-maven",
"gitea_url": "https://gitea.gc-si.dev"
}


@ -0,0 +1,49 @@
name: Build and Deploy SNP Sync Batch
on:
push:
branches:
- main
jobs:
build-and-deploy:
runs-on: ubuntu-latest
container:
image: maven:3.9-eclipse-temurin-17
steps:
- name: Checkout
run: |
git clone --depth=1 --branch=${GITHUB_REF_NAME} \
http://gitea:3000/${GITHUB_REPOSITORY}.git .
- name: Configure Maven settings
run: |
mkdir -p ~/.m2
cat > ~/.m2/settings.xml << 'SETTINGS'
<settings>
<mirrors>
<mirror>
<id>nexus</id>
<mirrorOf>*</mirrorOf>
<url>https://nexus.gc-si.dev/repository/maven-public/</url>
</mirror>
</mirrors>
<servers>
<server>
<id>nexus</id>
<username>${{ secrets.NEXUS_USERNAME }}</username>
<password>${{ secrets.NEXUS_PASSWORD }}</password>
</server>
</servers>
</settings>
SETTINGS
- name: Build
run: mvn clean package -DskipTests -B
- name: Deploy
run: |
cp target/snp-sync-batch-*.jar /deploy/snp-sync-batch/app.jar
date '+%Y-%m-%d %H:%M:%S' > /deploy/snp-sync-batch/.deploy-trigger
echo "Deployed at $(cat /deploy/snp-sync-batch/.deploy-trigger)"
ls -la /deploy/snp-sync-batch/

.githooks/commit-msg (Normal file → Executable file)

.githooks/post-checkout (Normal file → Executable file)

.githooks/pre-commit (Normal file → Executable file)

.gitignore (vendored): 1 line changed

@ -71,3 +71,4 @@ application-local.yml
frontend/node/
frontend/node_modules/
src/main/resources/static/
logs/


@ -1,17 +1,21 @@
# Project Overview
- **Type**: Java + Spring Boot + Maven
- **JDK**: 17 (see `.sdkmanrc`)
- **Framework**: Spring Boot
- **Project name**: SNP Sync Batch
- **Type**: Java + Spring Boot + Spring Batch + Maven
- **Description**: Batch system that collects S&P Global maritime data via API and syncs it into PostgreSQL (includes a web GUI)
- **JDK**: 17 (see `.sdkmanrc`; 17.0.18-amzn)
- **Frameworks**: Spring Boot 3.2.1, Spring Batch 5.1.0, Quartz 2.5.0
- **Build tool**: Maven (via Maven Wrapper)
- **DB**: PostgreSQL (dual datasource: batch-meta + business)
- **Frontend**: React (build integrated via frontend-maven-plugin)
## Build and Run
```bash
# Build
./mvnw clean compile
# Build (backend only)
./mvnw clean compile -DskipTests
# Packaging
# Full packaging (including frontend)
./mvnw clean package -DskipTests
# Tests
@ -20,7 +24,7 @@
# Run a specific test class
./mvnw test -Dtest=ClassName
# Run locally
# Run locally (port 8051, context-path: /snp-sync)
./mvnw spring-boot:run
# Lint (if Checkstyle is configured)
@ -33,21 +37,48 @@
src/
├── main/
│   ├── java/
│   │   └── com/gcsc/{project}/
│   │       ├── config/       # configuration classes
│   │       ├── controller/   # REST controllers
│   │       ├── service/      # business logic
│   │       ├── repository/   # data access
│   │       ├── domain/       # entities
│   │       ├── dto/          # data transfer objects
│   │       ├── exception/    # exception handling
│   │       └── util/         # utilities
│   │   └── com/snp/batch/
│   │       ├── SnpBatchApplication.java   # main class
│   │       ├── common/
│   │       │   ├── batch/                 # batch commons (Base classes)
│   │       │   │   ├── config/            # BaseJobConfig
│   │       │   │   ├── entity/            # BaseEntity
│   │       │   │   ├── processor/         # BaseProcessor
│   │       │   │   ├── reader/            # BaseApiReader
│   │       │   │   ├── repository/        # BaseJdbcRepository
│   │       │   │   └── writer/            # BaseWriter, BaseChunkedWriter
│   │       │   ├── util/                  # utilities (EntityUtils, CommonSql, etc.)
│   │       │   └── web/                   # web commons (BaseController, BaseService)
│   │       ├── jobs/datasync/batch/       # sync batch Job modules
│   │       │   ├── code/                  # code sync (Stat5Code, FlagCode)
│   │       │   ├── compliance/            # compliance sync
│   │       │   ├── facility/              # facility/port sync
│   │       │   ├── movement/              # vessel movement sync
│   │       │   ├── psc/                   # PSC inspection sync
│   │       │   └── ...                    # ship, event, risk, etc.
│   │       └── scheduler/                 # Quartz scheduler
│   └── resources/
│       ├── application.yml        # common config
│       ├── application-local.yml  # local config (.gitignore)
│       └── application-prod.yml   # production config
└── test/
    └── java/                      # test code
│       ├── application.yml        # common config
│       ├── application-dev.yml    # dev config
│       ├── application-prod.yml   # production config
│       └── application-local.yml  # local config (.gitignore)
├── test/
│   └── java/                      # test code
└── frontend/                      # React frontend
```
## Batch Job Structure Pattern
Each domain module follows the same structure:
```
jobs/datasync/batch/{domain}/
├── config/      # JobConfig (Step, Job definitions)
├── dto/         # API response DTOs
├── entity/      # DB entities
├── reader/      # ItemReader (API calls)
├── processor/   # ItemProcessor (DTO → Entity conversion)
├── repository/  # JdbcRepository (direct SQL)
└── writer/      # ItemWriter (DB persistence)
```
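The per-domain pattern above can be illustrated with a minimal sketch of the processor step. The names here (FlagCodeDto, FlagCodeEntity, FlagCodeProcessor) are hypothetical, not the project's actual classes; the sketch only shows the DTO → Entity conversion role that each domain's `processor/` package fills.

```java
// Hypothetical DTO/Entity pair and processor for a "code" domain module.
// Names are illustrative only; the real classes live under jobs/datasync/batch/code/.
record FlagCodeDto(String code, String name) {}

record FlagCodeEntity(String code, String name, String batchFlag) {}

class FlagCodeProcessor {
    // DTO → Entity conversion; returning null filters the item out,
    // mirroring Spring Batch's ItemProcessor contract.
    FlagCodeEntity process(FlagCodeDto dto) {
        if (dto.code() == null || dto.code().isBlank()) {
            return null; // skip rows without a usable key
        }
        return new FlagCodeEntity(dto.code().trim(), dto.name(), "N");
    }
}
```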
## Team Rules

docs/RELEASE-NOTES.md (new file): 28 lines

@ -0,0 +1,28 @@
# Release Notes
This document follows the [Keep a Changelog](https://keepachangelog.com/ko/1.0.0/) format.
## [Unreleased]
### Added
- Sync status menu: domain tabs + table accordions + inline data preview (#1)
- SyncStatusService: per-table N/P/S aggregation based on batch_flag (parallel queries)
- Lookup of records stuck in P state, plus a P→N reset feature
- Implemented abandon/stale execution management endpoints (#7)
### Changed
- Cleaned up the tables shown on the sync status page: removed ship-001/ship-002, tidied source-target key mappings (#11)
- Extracted BaseSyncReader: consolidated logic shared by 49 Readers, guaranteeing 1 chunk = 1 job_execution_id
- Moved chunk boundary control from GroupByExecutionIdPolicy into the Reader itself
- BatchWriteListener: builds its SQL at execution time, fixing the SOURCE_SCHEMA null issue
### Fixed
- Fixed records getting stuck in batch_flag P state (separated the timing of the Reader's N→P transition)
- Fixed BatchWriteListener SQL null reference (lazy creation at execution time instead of at bean construction)
- Fixed schedule toggle API method mismatch: POST → PATCH (#6)
### Misc
- Gitea Actions auto-deploy workflow (#12)
- Added logs/ to .gitignore
- Added chunk-size and sub-chunk-size settings to application-dev.yml
- Commented out Repository batch-insert logging
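The batch_flag lifecycle these notes revolve around (N→P when the Reader picks up a group, P→S after a successful write, stuck P reset back to N) can be sketched as a tiny state table. This is illustrative only; the project drives these transitions with SQL updates, not an enum.

```java
// Illustrative model of the batch_flag states: N = pending, P = processing, S = synced.
enum BatchFlag { N, P, S }

class FlagLifecycle {
    // Reader claims a pending group for processing
    static BatchFlag onRead(BatchFlag f)  { return f == BatchFlag.N ? BatchFlag.P : f; }
    // afterWrite marks the group as synced once the Writer succeeds
    static BatchFlag onWrite(BatchFlag f) { return f == BatchFlag.P ? BatchFlag.S : f; }
    // stuck-P recovery: reset so the next run reprocesses the record
    static BatchFlag reset(BatchFlag f)   { return f == BatchFlag.P ? BatchFlag.N : f; }
}
```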

frontend/.gitignore (new file, vendored): 24 lines

@ -0,0 +1,24 @@
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
pnpm-debug.log*
lerna-debug.log*
node_modules
dist
dist-ssr
*.local
# Editor directories and files
.vscode/*
!.vscode/extensions.json
.idea
.DS_Store
*.suo
*.ntvs*
*.njsproj
*.sln
*.sw?


@ -12,6 +12,7 @@ const Executions = lazy(() => import('./pages/Executions'));
const ExecutionDetail = lazy(() => import('./pages/ExecutionDetail'));
const Schedules = lazy(() => import('./pages/Schedules'));
const Timeline = lazy(() => import('./pages/Timeline'));
const SyncStatus = lazy(() => import('./pages/SyncStatus'));
function AppLayout() {
const { toasts, removeToast } = useToastContext();
@ -28,6 +29,7 @@ function AppLayout() {
<Route path="/executions/:id" element={<ExecutionDetail />} />
<Route path="/schedules" element={<Schedules />} />
<Route path="/schedule-timeline" element={<Timeline />} />
<Route path="/sync-status" element={<SyncStatus />} />
</Routes>
</Suspense>
</div>


@ -26,6 +26,16 @@ async function putJson<T>(url: string, body?: unknown): Promise<T> {
return res.json();
}
async function patchJson<T>(url: string, body?: unknown): Promise<T> {
const res = await fetch(url, {
method: 'PATCH',
headers: { 'Content-Type': 'application/json' },
body: body ? JSON.stringify(body) : undefined,
});
if (!res.ok) throw new Error(`API Error: ${res.status} ${res.statusText}`);
return res.json();
}
async function deleteJson<T>(url: string): Promise<T> {
const res = await fetch(url, { method: 'DELETE' });
if (!res.ok) throw new Error(`API Error: ${res.status} ${res.statusText}`);
@ -273,6 +283,48 @@ export interface ExecutionStatisticsDto {
avgDurationMs: number;
}
// ── Sync Status ─────────────────────────────────────────────
export interface SyncStatusSummary {
totalTables: number;
pendingCount: number;
processingCount: number;
completedCount: number;
stuckTables: number;
}
export interface SyncTableStatus {
tableKey: string;
sourceTable: string;
targetTable: string;
domain: string;
pendingCount: number;
processingCount: number;
completedCount: number;
lastSyncTime: string | null;
stuck: boolean;
}
export interface SyncDomainGroup {
domain: string;
domainLabel: string;
tables: SyncTableStatus[];
}
export interface SyncStatusResponse {
summary: SyncStatusSummary;
domains: SyncDomainGroup[];
}
export interface SyncDataPreviewResponse {
tableKey: string;
targetTable: string;
targetSchema: string;
columns: string[];
rows: Record<string, unknown>[];
totalCount: number;
}
// ── API Functions ────────────────────────────────────────────
export const batchApi = {
@ -358,7 +410,7 @@ export const batchApi = {
deleteJson<{ success: boolean; message: string }>(`${BASE}/schedules/${jobName}`),
toggleSchedule: (jobName: string, active: boolean) =>
postJson<{ success: boolean; message: string; data?: ScheduleResponse }>(
patchJson<{ success: boolean; message: string; data?: ScheduleResponse }>(
`${BASE}/schedules/${jobName}/toggle`, { active }),
// Timeline
@ -399,4 +451,18 @@ export const batchApi = {
resetRetryCount: (ids: number[]) =>
postJson<{ success: boolean; message: string; resetCount?: number }>(
`${BASE}/failed-records/reset-retry`, { ids }),
// Sync Status
getSyncStatus: () =>
fetchJson<SyncStatusResponse>(`${BASE}/sync-status`),
getSyncDataPreview: (tableKey: string, limit = 10) =>
fetchJson<SyncDataPreviewResponse>(`${BASE}/sync-status/${tableKey}/preview?limit=${limit}`),
getStuckRecords: (tableKey: string, limit = 50) =>
fetchJson<SyncDataPreviewResponse>(`${BASE}/sync-status/${tableKey}/stuck?limit=${limit}`),
resetStuckRecords: (tableKey: string) =>
postJson<{ success: boolean; message: string; resetCount?: number }>(
`${BASE}/sync-status/${tableKey}/reset`),
};


@ -7,6 +7,7 @@ const navItems = [
{ path: '/jobs', label: '작업', icon: '⚙️' },
{ path: '/schedules', label: '스케줄', icon: '🕐' },
{ path: '/schedule-timeline', label: '타임라인', icon: '📅' },
{ path: '/sync-status', label: '동기화 현황', icon: '🔄' },
];
export default function Navbar() {


@ -0,0 +1,127 @@
import { useState, useEffect } from 'react';
import { batchApi, type SyncDataPreviewResponse } from '../api/batchApi';
import LoadingSpinner from './LoadingSpinner';
interface Props {
open: boolean;
tableKey: string;
tableName: string;
onClose: () => void;
onReset: () => void;
}
export default function StuckRecordsModal({ open, tableKey, tableName, onClose, onReset }: Props) {
const [data, setData] = useState<SyncDataPreviewResponse | null>(null);
const [loading, setLoading] = useState(false);
const [error, setError] = useState<string | null>(null);
useEffect(() => {
if (!open || !tableKey) return;
setLoading(true);
setError(null);
batchApi.getStuckRecords(tableKey, 50)
.then(setData)
.catch((e) => setError(e.message))
.finally(() => setLoading(false));
}, [open, tableKey]);
if (!open) return null;
return (
<div className="fixed inset-0 z-50 flex items-center justify-center bg-wing-overlay" onClick={onClose}>
<div
className="bg-wing-surface rounded-xl shadow-2xl max-w-5xl w-full mx-4 max-h-[80vh] flex flex-col"
onClick={(e) => e.stopPropagation()}
>
{/* Header */}
<div className="flex items-center justify-between px-6 py-4 border-b border-wing-border">
<div>
<div className="flex items-center gap-2">
<span className="text-red-500">⚠</span>
<h3 className="text-lg font-semibold text-wing-text">P 상태 고착 레코드</h3>
</div>
<p className="text-xs text-wing-muted mt-0.5">
{tableName}
{data ? ` | ${data.targetSchema}.${data.targetTable} | 총 ${data.totalCount.toLocaleString()}건 고착` : ''}
</p>
</div>
<div className="flex items-center gap-2">
{data && data.totalCount > 0 && (
<button
onClick={onReset}
className="px-3 py-1.5 text-xs font-medium text-white bg-red-500 hover:bg-red-600 rounded-lg transition-colors"
>
P→N 리셋
</button>
)}
<button
onClick={onClose}
className="px-3 py-1.5 text-sm text-wing-muted hover:text-wing-text transition-colors"
>
닫기
</button>
</div>
</div>
{/* Body */}
<div className="flex-1 overflow-auto p-4">
{loading && <LoadingSpinner />}
{error && (
<div className="text-center py-8 text-red-400">조회 실패: {error}</div>
)}
{!loading && !error && data && data.rows.length === 0 && (
<div className="text-center py-8 text-wing-muted">P 상태 레코드가 없습니다</div>
)}
{!loading && !error && data && data.rows.length > 0 && (
<div className="overflow-x-auto">
<table className="w-full text-xs">
<thead>
<tr className="border-b border-wing-border">
{data.columns.map((col) => (
<th
key={col}
className={`px-3 py-2 text-left font-medium whitespace-nowrap bg-wing-card
${col === 'batch_flag' ? 'text-red-500' : 'text-wing-muted'}`}
>
{col}
</th>
))}
</tr>
</thead>
<tbody>
{data.rows.map((row, idx) => (
<tr key={idx} className="border-b border-wing-border/50 hover:bg-wing-hover">
{data.columns.map((col) => (
<td
key={col}
className={`px-3 py-1.5 whitespace-nowrap max-w-[200px] truncate
${col === 'batch_flag' ? 'text-red-500 font-bold' : 'text-wing-text'}`}
>
{formatCellValue(row[col])}
</td>
))}
</tr>
))}
</tbody>
</table>
</div>
)}
</div>
{/* Footer */}
{data && data.rows.length > 0 && (
<div className="px-6 py-3 border-t border-wing-border text-xs text-wing-muted flex items-center justify-between">
<span>{data.rows.length}건 표시 (총 {data.totalCount.toLocaleString()}건)</span>
<span className="text-red-400">리셋 시 batch_flag가 P→N으로 변경됩니다</span>
</div>
)}
</div>
</div>
);
}
function formatCellValue(value: unknown): string {
if (value === null || value === undefined) return '-';
if (typeof value === 'object') return JSON.stringify(value);
return String(value);
}

파일 보기

@ -0,0 +1,108 @@
import { useState, useEffect } from 'react';
import { batchApi, type SyncDataPreviewResponse } from '../api/batchApi';
import LoadingSpinner from './LoadingSpinner';
interface Props {
open: boolean;
tableKey: string;
tableName: string;
onClose: () => void;
}
export default function SyncDataPreviewModal({ open, tableKey, tableName, onClose }: Props) {
const [data, setData] = useState<SyncDataPreviewResponse | null>(null);
const [loading, setLoading] = useState(false);
const [error, setError] = useState<string | null>(null);
useEffect(() => {
if (!open || !tableKey) return;
setLoading(true);
setError(null);
batchApi.getSyncDataPreview(tableKey, 10)
.then(setData)
.catch((e) => setError(e.message))
.finally(() => setLoading(false));
}, [open, tableKey]);
if (!open) return null;
return (
<div className="fixed inset-0 z-50 flex items-center justify-center bg-wing-overlay" onClick={onClose}>
<div
className="bg-wing-surface rounded-xl shadow-2xl max-w-5xl w-full mx-4 max-h-[80vh] flex flex-col"
onClick={(e) => e.stopPropagation()}
>
{/* Header */}
<div className="flex items-center justify-between px-6 py-4 border-b border-wing-border">
<div>
<h3 className="text-lg font-semibold text-wing-text">{tableName}</h3>
<p className="text-xs text-wing-muted mt-0.5">
{data ? `${data.targetSchema}.${data.targetTable} | 총 ${data.totalCount.toLocaleString()}건` : ''}
</p>
</div>
<button
onClick={onClose}
className="px-3 py-1.5 text-sm text-wing-muted hover:text-wing-text transition-colors"
>
닫기
</button>
</div>
{/* Body */}
<div className="flex-1 overflow-auto p-4">
{loading && <LoadingSpinner />}
{error && (
<div className="text-center py-8 text-wing-muted">
<p className="text-red-400">조회 실패: {error}</p>
</div>
)}
{!loading && !error && data && data.rows.length === 0 && (
<div className="text-center py-8 text-wing-muted">데이터가 없습니다</div>
)}
{!loading && !error && data && data.rows.length > 0 && (
<div className="overflow-x-auto">
<table className="w-full text-xs">
<thead>
<tr className="border-b border-wing-border">
{data.columns.map((col) => (
<th
key={col}
className="px-3 py-2 text-left font-medium text-wing-muted whitespace-nowrap bg-wing-card"
>
{col}
</th>
))}
</tr>
</thead>
<tbody>
{data.rows.map((row, idx) => (
<tr key={idx} className="border-b border-wing-border/50 hover:bg-wing-hover">
{data.columns.map((col) => (
<td key={col} className="px-3 py-1.5 text-wing-text whitespace-nowrap max-w-[200px] truncate">
{formatCellValue(row[col])}
</td>
))}
</tr>
))}
</tbody>
</table>
</div>
)}
</div>
{/* Footer */}
{data && data.rows.length > 0 && (
<div className="px-6 py-3 border-t border-wing-border text-xs text-wing-muted">
{data.rows.length}건 표시 (총 {data.totalCount.toLocaleString()}건)
</div>
)}
</div>
</div>
);
}
function formatCellValue(value: unknown): string {
if (value === null || value === undefined) return '-';
if (typeof value === 'object') return JSON.stringify(value);
return String(value);
}


@ -0,0 +1,393 @@
import { useState, useCallback, useEffect } from 'react';
import {
batchApi,
type SyncStatusResponse,
type SyncTableStatus,
type SyncDataPreviewResponse,
} from '../api/batchApi';
import { usePoller } from '../hooks/usePoller';
import { useToastContext } from '../contexts/ToastContext';
import LoadingSpinner from '../components/LoadingSpinner';
import EmptyState from '../components/EmptyState';
import ConfirmModal from '../components/ConfirmModal';
import GuideModal, { HelpButton } from '../components/GuideModal';
const POLLING_INTERVAL = 30000;
const DOMAIN_ICONS: Record<string, string> = {
ship: '🚢',
company: '🏢',
event: '⚠️',
facility: '🏭',
psc: '🔍',
movements: '📍',
code: '🏷️',
'risk-compliance': '🛡️',
};
const GUIDE_ITEMS = [
{
title: '도메인 탭',
content: 'Ship, PSC 등 도메인별로 테이블을 그룹핑하여 조회합니다.\nP 고착 테이블이 있는 도메인에는 경고 뱃지가 표시됩니다.',
},
{
title: '테이블 아코디언',
content: '각 테이블을 펼치면 대기(N)/진행(P)/완료(S) 건수와 상세 데이터를 확인할 수 있습니다.\n⚠ 표시는 P 상태에 고착된 레코드가 있음을 의미합니다.',
},
{
title: '동기화 데이터 / P 상태 레코드',
content: '동기화 데이터 탭: 타겟 스키마(std_snp_svc)의 최근 동기화 데이터를 보여줍니다.\nP 상태 레코드 탭: Writer 실패로 P 상태에 멈춘 레코드를 확인하고 리셋할 수 있습니다.',
},
];
export default function SyncStatus() {
const { showToast } = useToastContext();
const [data, setData] = useState<SyncStatusResponse | null>(null);
const [loading, setLoading] = useState(true);
const [guideOpen, setGuideOpen] = useState(false);
// Tab & accordion state
const [activeDomain, setActiveDomain] = useState<string>('ship');
const [expandedTable, setExpandedTable] = useState<string>('ship-001');
const [detailTabs, setDetailTabs] = useState<Record<string, 'preview' | 'stuck'>>({});
// Reset confirm
const [resetTableKey, setResetTableKey] = useState('');
const [resetConfirmOpen, setResetConfirmOpen] = useState(false);
const [resetting, setResetting] = useState(false);
const loadData = useCallback(async () => {
try {
const result = await batchApi.getSyncStatus();
setData(result);
} catch {
if (loading) showToast('동기화 현황 조회 실패', 'error');
} finally {
setLoading(false);
}
// eslint-disable-next-line react-hooks/exhaustive-deps
}, []);
usePoller(loadData, POLLING_INTERVAL);
const toggleAccordion = (tableKey: string) => {
setExpandedTable((prev) => (prev === tableKey ? '' : tableKey));
};
const getDetailTab = (tableKey: string) => detailTabs[tableKey] || 'preview';
const setDetailTab = (tableKey: string, tab: 'preview' | 'stuck') => {
setDetailTabs((prev) => ({ ...prev, [tableKey]: tab }));
};
const handleReset = async () => {
setResetting(true);
try {
const result = await batchApi.resetStuckRecords(resetTableKey);
const allTables = data?.domains.flatMap((d) => d.tables) ?? [];
const table = allTables.find((t) => t.tableKey === resetTableKey);
showToast(`${table?.sourceTable ?? resetTableKey}: ${result.resetCount}건 리셋 완료`, 'success');
setResetConfirmOpen(false);
loadData();
} catch {
showToast('리셋 실패', 'error');
} finally {
setResetting(false);
}
};
const activeDomainGroup = data?.domains.find((d) => d.domain === activeDomain);
const resetTable = data?.domains.flatMap((d) => d.tables).find((t) => t.tableKey === resetTableKey);
if (loading) return <LoadingSpinner />;
if (!data) return <EmptyState message="데이터를 불러올 수 없습니다" />;
return (
<div>
{/* Header */}
<div className="flex items-center justify-between mb-6">
<h1 className="text-2xl font-bold text-wing-text">동기화 현황</h1>
<HelpButton onClick={() => setGuideOpen(true)} />
</div>
{/* ── Domain Tabs ── */}
<div className="flex gap-0 overflow-x-auto border-b border-wing-border mb-4">
{data.domains.map((d) => {
const stuckCount = d.tables.filter((t) => t.stuck).length;
return (
<button
key={d.domain}
onClick={() => setActiveDomain(d.domain)}
className={`px-4 py-2.5 text-sm font-medium whitespace-nowrap border-b-2 transition-colors
${activeDomain === d.domain
? 'border-wing-accent text-wing-accent'
: 'border-transparent text-wing-muted hover:text-wing-text hover:border-wing-border'
}`}
>
<span className="mr-1">{DOMAIN_ICONS[d.domain] || ''}</span>
{d.domainLabel}
<span className="ml-1 text-xs opacity-60">({d.tables.length})</span>
{stuckCount > 0 && (
<span className="ml-1.5 inline-flex items-center justify-center w-5 h-5 text-[10px] font-bold text-white bg-red-500 rounded-full">
{stuckCount}
</span>
)}
</button>
);
})}
</div>
{/* ── Table Accordions ── */}
{activeDomainGroup && (
<div className="space-y-2">
{activeDomainGroup.tables.map((table) => (
<TableAccordion
key={table.tableKey}
table={table}
expanded={expandedTable === table.tableKey}
detailTab={getDetailTab(table.tableKey)}
onToggle={() => toggleAccordion(table.tableKey)}
onDetailTabChange={(tab) => setDetailTab(table.tableKey, tab)}
onReset={() => { setResetTableKey(table.tableKey); setResetConfirmOpen(true); }}
/>
))}
</div>
)}
{/* Reset Confirm Modal */}
<ConfirmModal
open={resetConfirmOpen}
title="P→N 리셋 확인"
message={`${resetTable?.sourceTable ?? ''}의 P 상태 레코드를 모두 N(대기)으로 리셋하시겠습니까?\n리셋된 레코드는 다음 동기화 실행 시 재처리됩니다.`}
confirmLabel={resetting ? '리셋 중...' : '리셋'}
onConfirm={handleReset}
onCancel={() => setResetConfirmOpen(false)}
/>
<GuideModal
open={guideOpen}
pageTitle="동기화 현황"
sections={GUIDE_ITEMS}
onClose={() => setGuideOpen(false)}
/>
</div>
);
}
// ── Sub Components ────────────────────────────────────────────
interface TableAccordionProps {
table: SyncTableStatus;
expanded: boolean;
detailTab: 'preview' | 'stuck';
onToggle: () => void;
onDetailTabChange: (tab: 'preview' | 'stuck') => void;
onReset: () => void;
}
function TableAccordion({ table, expanded, detailTab, onToggle, onDetailTabChange, onReset }: TableAccordionProps) {
return (
<div className={`bg-wing-card rounded-xl border overflow-hidden
${table.stuck ? 'border-amber-400 ring-1 ring-amber-100' : 'border-wing-border'}`}>
{/* Accordion header */}
<button
onClick={onToggle}
className="w-full flex items-center justify-between px-5 py-3 hover:bg-wing-hover transition-colors text-left"
>
<div className="flex items-center gap-3">
{table.stuck && <span className="text-red-500">⚠</span>}
<div>
<span className="text-wing-text font-semibold text-sm">{table.targetTable}</span>
<span className="text-xs text-wing-muted ml-2">{table.tableKey}</span>
</div>
</div>
<div className="flex items-center gap-2 text-xs">
<span className="inline-flex items-center gap-1 px-2.5 py-0.5 rounded-full font-semibold tabular-nums bg-amber-100 text-amber-700">
{table.pendingCount.toLocaleString()}
</span>
<span className={`inline-flex items-center gap-1 px-2.5 py-0.5 rounded-full font-semibold tabular-nums
${table.stuck ? 'bg-red-100 text-red-700' : 'bg-blue-100 text-blue-700'}`}>
{table.processingCount.toLocaleString()}
</span>
<span className="inline-flex items-center gap-1 px-2.5 py-0.5 rounded-full font-semibold tabular-nums bg-emerald-100 text-emerald-700">
{table.completedCount.toLocaleString()}
</span>
<span className="text-wing-muted w-16 text-right ml-1">{formatRelativeTime(table.lastSyncTime)}</span>
<span className="text-wing-muted">{expanded ? '▲' : '▼'}</span>
</div>
</button>
{/* Accordion body */}
{expanded && (
<div className="border-t border-wing-border p-5">
{/* Stats row */}
<div className="grid grid-cols-2 md:grid-cols-4 gap-3 mb-4">
<MiniStat label="대기 (N)" value={table.pendingCount} color="text-amber-600 dark:text-amber-400" />
<MiniStat label="진행 (P)" value={table.processingCount} color="text-blue-600 dark:text-blue-400"
warn={table.stuck} />
<MiniStat label="완료 (S)" value={table.completedCount} color="text-emerald-600 dark:text-emerald-400" />
<div className="bg-wing-bg rounded-lg p-3">
<p className="text-xs text-wing-muted">마지막 동기화</p>
<p className="text-sm font-medium text-wing-text mt-0.5">{formatRelativeTime(table.lastSyncTime)}</p>
</div>
</div>
{/* Detail sub-tabs */}
<div className="flex items-center justify-between mb-3">
<div className="flex gap-1">
<button
onClick={() => onDetailTabChange('preview')}
className={`px-3 py-1.5 rounded-lg text-xs font-medium transition-colors
${detailTab === 'preview'
? 'bg-wing-accent text-white'
: 'bg-wing-bg text-wing-muted hover:bg-wing-hover'
}`}
>
동기화 데이터
</button>
<button
onClick={() => onDetailTabChange('stuck')}
className={`px-3 py-1.5 rounded-lg text-xs font-medium transition-colors
${detailTab === 'stuck'
? 'bg-wing-accent text-white'
: 'bg-wing-bg text-wing-muted hover:bg-wing-hover'
}`}
>
P 상태 레코드
{table.processingCount > 0 && (
<span className="ml-1 text-red-300">({table.processingCount.toLocaleString()})</span>
)}
</button>
</div>
{detailTab === 'stuck' && table.stuck && (
<button
onClick={onReset}
className="px-3 py-1.5 text-xs font-medium text-white bg-red-500 hover:bg-red-600 rounded-lg transition-colors"
>
P→N 리셋
</button>
)}
</div>
{/* Tab content */}
{detailTab === 'preview' && (
<InlineDataTable tableKey={table.tableKey} fetchFn={batchApi.getSyncDataPreview} />
)}
{detailTab === 'stuck' && (
<InlineDataTable tableKey={table.tableKey} fetchFn={batchApi.getStuckRecords} />
)}
</div>
)}
</div>
);
}
interface MiniStatProps {
label: string;
value: number;
color: string;
warn?: boolean;
}
function MiniStat({ label, value, color, warn }: MiniStatProps) {
return (
<div className={`bg-wing-bg rounded-lg p-3 ${warn ? 'ring-1 ring-red-400' : ''}`}>
<p className="text-xs text-wing-muted">{label}</p>
<p className={`text-lg font-bold mt-0.5 tabular-nums ${color}`}>
{value.toLocaleString()}
{warn && <span className="ml-1 text-xs text-red-500">⚠</span>}
</p>
</div>
);
}
interface InlineDataTableProps {
tableKey: string;
fetchFn: (tableKey: string, limit: number) => Promise<SyncDataPreviewResponse>;
}
function InlineDataTable({ tableKey, fetchFn }: InlineDataTableProps) {
const [data, setData] = useState<SyncDataPreviewResponse | null>(null);
const [loading, setLoading] = useState(false);
const [error, setError] = useState<string | null>(null);
useEffect(() => {
setLoading(true);
setError(null);
setData(null);
fetchFn(tableKey, 20)
.then(setData)
.catch((e) => setError(e.message))
.finally(() => setLoading(false));
}, [tableKey, fetchFn]);
if (loading) return <div className="py-8"><LoadingSpinner /></div>;
if (error) return <div className="text-center py-8 text-red-400 text-sm">조회 실패: {error}</div>;
if (!data || data.rows.length === 0) {
return <div className="text-center py-8 text-wing-muted text-sm">데이터가 없습니다</div>;
}
return (
<div>
<div className="overflow-x-auto rounded-lg border border-wing-border">
<table className="w-full text-xs">
<thead>
<tr className="border-b border-wing-border">
{data.columns.map((col) => (
<th
key={col}
className={`px-3 py-2 text-left font-medium whitespace-nowrap bg-wing-bg
${col === 'batch_flag' ? 'text-blue-500' : 'text-wing-muted'}`}
>
{col}
</th>
))}
</tr>
</thead>
<tbody>
{data.rows.map((row, idx) => (
<tr key={idx} className="border-b border-wing-border/50 hover:bg-wing-hover">
{data.columns.map((col) => (
<td
key={col}
className={`px-3 py-1.5 whitespace-nowrap max-w-[200px] truncate
${col === 'batch_flag' ? 'font-bold text-blue-600 dark:text-blue-400' : 'text-wing-text'}`}
>
{formatCellValue(row[col])}
</td>
))}
</tr>
))}
</tbody>
</table>
</div>
<p className="text-xs text-wing-muted mt-2">
{data.rows.length}건 표시 (총 {data.totalCount.toLocaleString()}건) &middot; {data.targetSchema}.{data.targetTable}
</p>
</div>
);
}
function formatCellValue(value: unknown): string {
if (value === null || value === undefined) return '-';
if (typeof value === 'object') return JSON.stringify(value);
return String(value);
}
function formatRelativeTime(dateStr: string | null): string {
if (!dateStr) return '-';
try {
const date = new Date(dateStr);
if (isNaN(date.getTime())) return '-';
const now = new Date();
const diffMs = now.getTime() - date.getTime();
const diffMin = Math.floor(diffMs / 60000);
if (diffMin < 1) return '방금 전';
if (diffMin < 60) return `${diffMin}분 전`;
const diffHour = Math.floor(diffMin / 60);
if (diffHour < 24) return `${diffHour}시간 전`;
const diffDay = Math.floor(diffHour / 24);
return `${diffDay}일 전`;
} catch {
return '-';
}
}


@ -94,7 +94,7 @@ public abstract class BaseJobConfig<I, O> {
if (processor != null) {
var chunkBuilder = stepBuilder
.<I, O>chunk(getChunkSize(), transactionManager)
.<I, O>chunk(Integer.MAX_VALUE, transactionManager)
.reader(createReader())
.processor(processor)
.writer(createWriter());
@ -104,7 +104,7 @@ public abstract class BaseJobConfig<I, O> {
} else {
@SuppressWarnings("unchecked")
var chunkBuilder = stepBuilder
.<I, I>chunk(getChunkSize(), transactionManager)
.<I, I>chunk(Integer.MAX_VALUE, transactionManager)
.reader(createReader())
.writer((ItemWriter<? super I>) createWriter());


@ -55,7 +55,7 @@ public abstract class BaseProcessor<I, O> implements ItemProcessor<I, O> {
return null;
}
log.debug("데이터 처리 중: {}", item);
// log.debug("데이터 처리 중: {}", item);
return processItem(item);
}
}


@ -0,0 +1,98 @@
package com.snp.batch.common.batch.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.util.JobExecutionGroupable;
import com.snp.batch.common.util.TableMetaInfo;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;
/**
 * Abstract base class for sync Readers.
 *
 * Guarantees 1 chunk = 1 job_execution_id:
 * - Once all rows of a group have been returned, returns null to end the chunk
 * - Used together with chunk(Integer.MAX_VALUE) so the Reader controls chunk boundaries
 * - The next group's N→P transition happens only after the previous group's chunk
 *   processing (Write + P→S) has completed
 *
 * @param <T> DTO type (must implement JobExecutionGroupable)
 */
@Slf4j
public abstract class BaseSyncReader<T extends JobExecutionGroupable> implements ItemReader<T> {
protected final TableMetaInfo tableMetaInfo;
protected final JdbcTemplate businessJdbcTemplate;
private List<T> allDataBuffer = new ArrayList<>();
private Long currentGroupId = null;
protected BaseSyncReader(DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
}
/**
 * Returns the source table name (e.g. tableMetaInfo.sourceIceClass)
 */
protected abstract String getSourceTable();
/**
 * Maps a ResultSet row to a DTO
 */
protected abstract T mapRow(ResultSet rs, Long targetId) throws SQLException;
protected String getLogPrefix() {
return getClass().getSimpleName();
}
@Override
public T read() throws Exception {
if (allDataBuffer.isEmpty()) {
// previous group fully processed → return null to close the chunk
// (the next group is loaded in the next chunk, after Writer + afterWrite(P→S) have run)
if (currentGroupId != null) {
currentGroupId = null;
return null;
}
// load the next group
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null; // no more data to process → end the Step
}
return allDataBuffer.remove(0);
}
private void fetchNextGroup() {
Long nextTargetId;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(getSourceTable()), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId == null) return;
log.info("[{}] 다음 처리 대상 ID 발견: {}", getLogPrefix(), nextTargetId);
String sql = CommonSql.getTargetDataQuery(getSourceTable());
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) ->
mapRow(rs, nextTargetId), nextTargetId);
// N→P transition
String updateSql = CommonSql.getProcessBatchQuery(getSourceTable());
businessJdbcTemplate.update(updateSql, nextTargetId);
currentGroupId = nextTargetId;
}
}
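The chunk-boundary contract described in the Javadoc above can be modeled with a small in-memory reader. This is a simplified sketch, not the project's actual Spring Batch wiring: the queue of groups stands in for the `getNextTargetQuery` lookup, and the boolean flag mirrors `currentGroupId != null`.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

// Simplified model of BaseSyncReader.read(): each group is drained item by item,
// then a single null closes the chunk before the next group is loaded.
class GroupReader {
    private final Deque<List<String>> groups;          // pending groups (one per job_execution_id)
    private List<String> buffer = new ArrayList<>();   // current group's rows
    private boolean inGroup = false;                   // mirrors currentGroupId != null

    GroupReader(List<List<String>> source) {
        this.groups = new ArrayDeque<>(source);
    }

    String read() {
        if (buffer.isEmpty()) {
            if (inGroup) {
                inGroup = false;
                return null; // group finished: close the chunk so the Writer runs now
            }
            if (groups.isEmpty()) {
                return null; // no more groups: end the step
            }
            buffer = new ArrayList<>(groups.poll()); // the real reader also flips N→P here
            inGroup = true;
        }
        return buffer.remove(0);
    }
}
```

With `chunk(Integer.MAX_VALUE)`, the framework keeps calling `read()` until it returns null, so the null emitted between groups is what forces one chunk (and therefore one Write + P→S update) per job_execution_id.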


@ -203,7 +203,7 @@ public abstract class BaseJdbcRepository<T, ID> {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", getEntityName(), entities.size());
// log.debug("{} 배치 삽입 시작: {} 건", getEntityName(), entities.size());
jdbcTemplate.batchUpdate(getInsertSql(), entities, entities.size(),
(ps, entity) -> {
@ -215,7 +215,7 @@ public abstract class BaseJdbcRepository<T, ID> {
}
});
log.debug("{} 배치 삽입 완료: {} 건", getEntityName(), entities.size());
// log.debug("{} 배치 삽입 완료: {} 건", getEntityName(), entities.size());
}
/**

View file

@ -78,7 +78,7 @@ public abstract class MultiDataSourceJdbcRepository<T, ID> {
return;
}
log.debug("{} 배치 삽입 시작: {} 건 (Business DB)", getEntityName(), entities.size());
// log.debug("{} 배치 삽입 시작: {} 건 (Business DB)", getEntityName(), entities.size());
// Uses businessJdbcTemplate
businessJdbcTemplate.batchUpdate(getInsertSql(), entities, entities.size(),
@ -91,7 +91,7 @@ public abstract class MultiDataSourceJdbcRepository<T, ID> {
}
});
log.debug("{} 배치 삽입 완료: {} 건", getEntityName(), entities.size());
// log.debug("{} 배치 삽입 완료: {} 건", getEntityName(), entities.size());
}
// ... (the remaining find, save, update, and delete methods are likewise implemented with businessJdbcTemplate.)

View file

@ -1,51 +1,55 @@
package com.snp.batch.common.util;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.core.ItemWriteListener;
import org.springframework.batch.item.Chunk;
import org.springframework.jdbc.core.JdbcTemplate;
/**
 * Listener that updates batch_flag P → 'S' after the Writer succeeds
 *
 * The SQL is built at execution time (guarantees CommonSql.SOURCE_SCHEMA is initialized)
 */
@Slf4j
@RequiredArgsConstructor
public class BatchWriteListener<S extends JobExecutionGroupable> implements ItemWriteListener<S> {
private final JdbcTemplate businessJdbcTemplate;
private final String updateSql; // query to execute (e.g. "UPDATE ... SET batch_flag = 'S' ...")
private final String sourceTable;
public BatchWriteListener(JdbcTemplate businessJdbcTemplate, String sourceTable) {
this.businessJdbcTemplate = businessJdbcTemplate;
this.sourceTable = sourceTable;
}
@Override
public void afterWrite(Chunk<? extends S> items) {
// afterWrite must be guaranteed to run only when the Writer completed without an exception
if (items.isEmpty()) return;
Long jobExecutionId = items.getItems().get(0).getJobExecutionId();
try {
int updatedRows = businessJdbcTemplate.update(updateSql, jobExecutionId);
// Build the SQL at execution time to avoid a null SOURCE_SCHEMA
String sql = CommonSql.getCompleteBatchQuery(sourceTable);
int updatedRows = businessJdbcTemplate.update(sql, jobExecutionId);
log.info("[BatchWriteListener] Success update 'S'. jobExecutionId: {}, rows: {}", jobExecutionId, updatedRows);
} catch (Exception e) {
log.error("[BatchWriteListener] Update 'S' failed. jobExecutionId: {}", jobExecutionId, e);
// Important: rethrow so that a failed listener update also stops the batch
throw e;
}
}
@Override
public void onWriteError(Exception exception, Chunk<? extends S> items) {
// Called when the Writer throws an error
if (!items.isEmpty()) {
Long jobExecutionId = items.getItems().get(0).getJobExecutionId();
log.error("[BatchWriteListener] Write Error Detected! jobExecutionId: {}. Status will NOT be updated to 'S'. Error: {}",
jobExecutionId, exception.getMessage());
}
// Important: the exception must be rethrown here so the batch stops (FAILED)
// Otherwise the batch may keep attempting the next chunk
if (exception instanceof RuntimeException) {
throw (RuntimeException) exception;
} else {
throw new RuntimeException("Force stop batch due to write error", exception);
}
}
}
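The rethrow rule in `onWriteError` above can be isolated into a tiny sketch: runtime exceptions propagate as-is, checked exceptions are wrapped so the step still fails fast. `RethrowDemo` and `asRuntime` are illustrative names, not part of the codebase:

```java
// Runtime exceptions pass through unchanged; anything checked is wrapped in a
// RuntimeException carrying the original as its cause.
class RethrowDemo {
    static RuntimeException asRuntime(Exception e) {
        return (e instanceof RuntimeException re)
                ? re
                : new RuntimeException("Force stop batch due to write error", e);
    }

    public static void main(String[] args) {
        System.out.println(asRuntime(new IllegalStateException("x"))
                .getClass().getSimpleName()); // IllegalStateException
        System.out.println(asRuntime(new java.io.IOException("y"))
                .getClass().getSimpleName()); // RuntimeException
    }
}
```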

View file

@ -13,9 +13,6 @@ public class TableMetaInfo {
*/
// Ship Tables
@Value("${app.batch.source-schema.tables.ship-001}")
public String sourceShipData;
@Value("${app.batch.source-schema.tables.ship-002}")
public String sourceShipDetailData;
@ -158,10 +155,10 @@ public class TableMetaInfo {
@Value("${app.batch.source-schema.tables.risk-compliance-001}")
public String sourceRisk;
@Value("${app.batch.source-schema.tables.risk-compliance-002}")
@Value("${app.batch.source-schema.tables.risk-compliance-003}")
public String sourceCompliance;
@Value("${app.batch.source-schema.tables.risk-compliance-003}")
@Value("${app.batch.source-schema.tables.risk-compliance-006}")
public String sourceTbCompanyComplianceInfo;
@ -172,11 +169,8 @@ public class TableMetaInfo {
*/
// Ship Tables
@Value("${app.batch.target-schema.tables.ship-001}")
public String targetTbShipInfoMst;
@Value("${app.batch.target-schema.tables.ship-002}")
public String targetTbShipMainInfo;
public String targetTbShipInfoMst;
@Value("${app.batch.target-schema.tables.ship-003}")
public String targetTbShipAddInfo;

View file

@ -0,0 +1,26 @@
package com.snp.batch.global.config;
import lombok.Getter;
import lombok.Setter;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;
import java.util.LinkedHashMap;
import java.util.Map;
@Getter
@Setter
@Component
@ConfigurationProperties(prefix = "app.batch")
public class BatchTableProperties {
private SchemaConfig sourceSchema = new SchemaConfig();
private SchemaConfig targetSchema = new SchemaConfig();
@Getter
@Setter
public static class SchemaConfig {
private String name;
private Map<String, String> tables = new LinkedHashMap<>();
}
}
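The `tables` map bound by `BatchTableProperties` (keys like `ship-001` under `app.batch.source-schema.tables`) is what lets SQL be generated per table key. A hypothetical resolver sketch follows; the real `CommonSql` query text is not shown in this diff, so the UPDATE below is only an assumption modeled on the batch_flag 'P' → 'S' transition that `BatchWriteListener` performs:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical resolver over the key -> table-name map. The UPDATE string is
// an illustrative guess, not CommonSql's actual query.
class TableSqlResolver {
    private final String schema;
    private final Map<String, String> tables;

    TableSqlResolver(String schema, Map<String, String> tables) {
        this.schema = schema;
        this.tables = new LinkedHashMap<>(tables);
    }

    String completeBatchSql(String tableKey) {
        String table = tables.get(tableKey);
        if (table == null) {
            throw new IllegalArgumentException("Unknown table key: " + tableKey);
        }
        return "UPDATE " + schema + "." + table
                + " SET batch_flag = 'S' WHERE batch_flag = 'P' AND job_execution_id = ?";
    }

    public static void main(String[] args) {
        TableSqlResolver r = new TableSqlResolver("src", Map.of("ship-001", "ship_data"));
        System.out.println(r.completeBatchSql("ship-001"));
    }
}
```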

View file

@ -3,8 +3,11 @@ package com.snp.batch.global.controller;
import com.snp.batch.global.dto.JobExecutionDto;
import com.snp.batch.global.dto.ScheduleRequest;
import com.snp.batch.global.dto.ScheduleResponse;
import com.snp.batch.global.dto.SyncDataPreviewResponse;
import com.snp.batch.global.dto.SyncStatusResponse;
import com.snp.batch.service.BatchService;
import com.snp.batch.service.ScheduleService;
import com.snp.batch.service.SyncStatusService;
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.Parameter;
import io.swagger.v3.oas.annotations.responses.ApiResponse;
@ -27,6 +30,7 @@ public class BatchController {
private final BatchService batchService;
private final ScheduleService scheduleService;
private final SyncStatusService syncStatusService;
@Operation(summary = "배치 작업 실행", description = "지정된 배치 작업을 즉시 실행합니다. 쿼리 파라미터로 Job Parameters 전달 가능")
@ApiResponses(value = {
@ -137,6 +141,61 @@ public class BatchController {
}
}
// Stale / Abandon API
@Operation(summary = "장기 실행(stale) 목록 조회", description = "thresholdMinutes 이상 실행 중인 배치 목록을 조회합니다")
@GetMapping("/executions/stale")
public ResponseEntity<List<JobExecutionDto>> getStaleExecutions(
@RequestParam(defaultValue = "60") int thresholdMinutes) {
List<JobExecutionDto> staleExecutions = batchService.getStaleExecutions(thresholdMinutes);
return ResponseEntity.ok(staleExecutions);
}
@Operation(summary = "실행 강제 종료(abandon)", description = "특정 배치 실행을 강제 종료합니다")
@PostMapping("/executions/{executionId}/abandon")
public ResponseEntity<Map<String, Object>> abandonExecution(@PathVariable Long executionId) {
log.info("Received request to abandon execution: {}", executionId);
try {
batchService.abandonExecution(executionId);
return ResponseEntity.ok(Map.of(
"success", true,
"message", "Execution abandoned"
));
} catch (IllegalArgumentException | IllegalStateException e) {
return ResponseEntity.badRequest().body(Map.of(
"success", false,
"message", e.getMessage()
));
} catch (Exception e) {
log.error("Error abandoning execution: {}", executionId, e);
return ResponseEntity.internalServerError().body(Map.of(
"success", false,
"message", "Failed to abandon execution: " + e.getMessage()
));
}
}
@Operation(summary = "stale 실행 일괄 강제 종료", description = "장기 실행 중인 모든 배치를 일괄 강제 종료합니다")
@PostMapping("/executions/stale/abandon-all")
public ResponseEntity<Map<String, Object>> abandonAllStale(
@RequestParam(defaultValue = "60") int thresholdMinutes) {
log.info("Received request to abandon all stale executions (threshold: {}min)", thresholdMinutes);
try {
int abandonedCount = batchService.abandonAllStale(thresholdMinutes);
return ResponseEntity.ok(Map.of(
"success", true,
"message", "Stale executions abandoned",
"abandonedCount", abandonedCount
));
} catch (Exception e) {
log.error("Error abandoning stale executions", e);
return ResponseEntity.internalServerError().body(Map.of(
"success", false,
"message", "Failed to abandon stale executions: " + e.getMessage()
));
}
}
@Operation(summary = "스케줄 목록 조회", description = "등록된 모든 스케줄을 조회합니다")
@ApiResponses(value = {
@ApiResponse(responseCode = "200", description = "조회 성공")
@ -324,4 +383,84 @@ public class BatchController {
com.snp.batch.global.dto.ExecutionStatisticsDto stats = batchService.getJobStatistics(jobName, days);
return ResponseEntity.ok(stats);
}
// Sync status API
@Operation(summary = "동기화 현황 조회", description = "전체 테이블의 batch_flag 기반 동기화 현황을 조회합니다")
@GetMapping("/sync-status")
public ResponseEntity<SyncStatusResponse> getSyncStatus() {
log.info("Received request to get sync status");
try {
SyncStatusResponse status = syncStatusService.getSyncStatus();
return ResponseEntity.ok(status);
} catch (Exception e) {
log.error("Error getting sync status", e);
return ResponseEntity.internalServerError().build();
}
}
@Operation(summary = "동기화 데이터 미리보기", description = "특정 테이블의 최근 동기화 성공 데이터를 조회합니다")
@GetMapping("/sync-status/{tableKey}/preview")
public ResponseEntity<SyncDataPreviewResponse> getSyncDataPreview(
@Parameter(description = "테이블 키 (예: ship-001)", required = true)
@PathVariable String tableKey,
@Parameter(description = "조회 건수", example = "10")
@RequestParam(defaultValue = "10") int limit) {
log.info("Received request to preview sync data for: {}", tableKey);
try {
SyncDataPreviewResponse preview = syncStatusService.getDataPreview(tableKey, limit);
return ResponseEntity.ok(preview);
} catch (IllegalArgumentException e) {
return ResponseEntity.badRequest().build();
} catch (Exception e) {
log.error("Error getting sync data preview for: {}", tableKey, e);
return ResponseEntity.internalServerError().build();
}
}
@Operation(summary = "P 상태 고착 레코드 조회", description = "특정 테이블의 batch_flag='P' 고착 레코드를 조회합니다")
@GetMapping("/sync-status/{tableKey}/stuck")
public ResponseEntity<SyncDataPreviewResponse> getStuckRecords(
@Parameter(description = "테이블 키 (예: ship-001)", required = true)
@PathVariable String tableKey,
@Parameter(description = "조회 건수", example = "50")
@RequestParam(defaultValue = "50") int limit) {
log.info("Received request to get stuck records for: {}", tableKey);
try {
SyncDataPreviewResponse stuck = syncStatusService.getStuckRecords(tableKey, limit);
return ResponseEntity.ok(stuck);
} catch (IllegalArgumentException e) {
return ResponseEntity.badRequest().build();
} catch (Exception e) {
log.error("Error getting stuck records for: {}", tableKey, e);
return ResponseEntity.internalServerError().build();
}
}
@Operation(summary = "P 상태 고착 레코드 리셋", description = "특정 테이블의 batch_flag='P' 레코드를 'N'으로 리셋합니다")
@PostMapping("/sync-status/{tableKey}/reset")
public ResponseEntity<Map<String, Object>> resetStuckRecords(
@Parameter(description = "테이블 키 (예: ship-001)", required = true)
@PathVariable String tableKey) {
log.info("Received request to reset stuck records for: {}", tableKey);
try {
int resetCount = syncStatusService.resetStuckRecords(tableKey);
return ResponseEntity.ok(Map.of(
"success", true,
"message", "P→N 리셋 완료",
"resetCount", resetCount
));
} catch (IllegalArgumentException e) {
return ResponseEntity.badRequest().body(Map.of(
"success", false,
"message", e.getMessage()
));
} catch (Exception e) {
log.error("Error resetting stuck records for: {}", tableKey, e);
return ResponseEntity.internalServerError().body(Map.of(
"success", false,
"message", "리셋 실패: " + e.getMessage()
));
}
}
}

View file

@ -13,9 +13,9 @@ import org.springframework.web.bind.annotation.GetMapping;
public class WebViewController {
@GetMapping({"/", "/jobs", "/executions", "/executions/{id:\\d+}",
"/schedules", "/schedule-timeline",
"/schedules", "/schedule-timeline", "/sync-status",
"/jobs/**", "/executions/**",
"/schedules/**", "/schedule-timeline/**"})
"/schedules/**", "/schedule-timeline/**", "/sync-status/**"})
public String forward() {
return "forward:/index.html";
}

View file

@ -0,0 +1,26 @@
package com.snp.batch.global.dto;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Getter;
import lombok.NoArgsConstructor;
import java.util.List;
import java.util.Map;
/**
 * Preview response for successfully synchronized data
 */
@Getter
@Builder
@NoArgsConstructor
@AllArgsConstructor
public class SyncDataPreviewResponse {
private String tableKey;
private String targetTable;
private String targetSchema;
private List<String> columns;
private List<Map<String, Object>> rows;
private long totalCount;
}

View file

@ -0,0 +1,59 @@
package com.snp.batch.global.dto;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Getter;
import lombok.NoArgsConstructor;
import java.util.List;
/**
 * Overall sync status response
 */
@Getter
@Builder
@NoArgsConstructor
@AllArgsConstructor
public class SyncStatusResponse {
private SyncStatusSummary summary;
private List<SyncDomainGroup> domains;
@Getter
@Builder
@NoArgsConstructor
@AllArgsConstructor
public static class SyncStatusSummary {
private int totalTables;
private long pendingCount;
private long processingCount;
private long completedCount;
private int stuckTables;
}
@Getter
@Builder
@NoArgsConstructor
@AllArgsConstructor
public static class SyncDomainGroup {
private String domain;
private String domainLabel;
private List<SyncTableStatus> tables;
}
@Getter
@Builder
@NoArgsConstructor
@AllArgsConstructor
public static class SyncTableStatus {
private String tableKey;
private String sourceTable;
private String targetTable;
private String domain;
private long pendingCount;
private long processingCount;
private long completedCount;
private String lastSyncTime;
private boolean stuck;
}
}
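How the `summary` block relates to the per-table statuses can be sketched in plain Java (records instead of Lombok; `SyncSummaryDemo` and its aggregation are an assumption about the service logic, which this diff does not show). Field names mirror `SyncStatusResponse`:

```java
import java.util.List;

// Aggregates per-table counts into the summary shape used by the sync-status
// response: totals are summed, stuckTables counts tables flagged as stuck.
class SyncSummaryDemo {
    record TableStatus(long pending, long processing, long completed, boolean stuck) {}
    record Summary(int totalTables, long pendingCount, long processingCount,
                   long completedCount, int stuckTables) {}

    static Summary summarize(List<TableStatus> tables) {
        return new Summary(
                tables.size(),
                tables.stream().mapToLong(TableStatus::pending).sum(),
                tables.stream().mapToLong(TableStatus::processing).sum(),
                tables.stream().mapToLong(TableStatus::completed).sum(),
                (int) tables.stream().filter(TableStatus::stuck).count());
    }

    public static void main(String[] args) {
        Summary s = summarize(List.of(
                new TableStatus(5, 1, 100, false),
                new TableStatus(0, 3, 40, true)));
        System.out.println(s);
        // Summary[totalTables=2, pendingCount=5, processingCount=4, completedCount=140, stuckTables=1]
    }
}
```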

View file

@ -2,10 +2,7 @@ package com.snp.batch.jobs.datasync.batch.code.config;
import com.snp.batch.common.batch.config.BaseJobConfig;
import com.snp.batch.common.util.BatchWriteListener;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.util.GroupByExecutionIdChunkListener;
import com.snp.batch.common.util.GroupByExecutionIdPolicy;
import com.snp.batch.common.util.GroupByExecutionIdReadListener;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.code.dto.FlagCodeDto;
import com.snp.batch.jobs.datasync.batch.code.dto.Stat5CodeDto;
@ -113,14 +110,12 @@ public class CodeSyncJobConfig extends BaseJobConfig<FlagCodeDto, FlagCodeEntity
@Bean
public BatchWriteListener<FlagCodeEntity> flagCodeWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceFlagCode);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceFlagCode);
}
@Bean
public BatchWriteListener<Stat5CodeEntity> stat5CodeWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceStat5Code);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceStat5Code);
}
// --- Steps ---
@ -129,12 +124,10 @@ public class CodeSyncJobConfig extends BaseJobConfig<FlagCodeDto, FlagCodeEntity
public Step flagCodeSyncStep() {
log.info("Step 생성: flagCodeSyncStep");
return new StepBuilder(getStepName(), jobRepository)
.<FlagCodeDto, FlagCodeEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<FlagCodeDto, FlagCodeEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(createReader())
.processor(createProcessor())
.writer(createWriter())
.listener(new GroupByExecutionIdReadListener<FlagCodeDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(flagCodeWriteListener())
.build();
}
@ -143,12 +136,10 @@ public class CodeSyncJobConfig extends BaseJobConfig<FlagCodeDto, FlagCodeEntity
public Step stat5CodeSyncStep() {
log.info("Step 생성: stat5CodeSyncStep");
return new StepBuilder("stat5CodeSyncStep", jobRepository)
.<Stat5CodeDto, Stat5CodeEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<Stat5CodeDto, Stat5CodeEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(stat5CodeReader(businessDataSource, tableMetaInfo))
.processor(new Stat5CodeProcessor())
.writer(new Stat5CodeWriter(codeRepository))
.listener(new GroupByExecutionIdReadListener<Stat5CodeDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(stat5CodeWriteListener())
.build();
}

View file

@ -1,68 +1,36 @@
package com.snp.batch.jobs.datasync.batch.code.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.code.dto.FlagCodeDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class FlagCodeReader implements ItemReader<FlagCodeDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<FlagCodeDto> allDataBuffer = new ArrayList<>();
public class FlagCodeReader extends BaseSyncReader<FlagCodeDto> {
public FlagCodeReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public FlagCodeDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceFlagCode;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceFlagCode), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[FlagCodeReader] 다음 처리 대상 ID 발견: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceFlagCode);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return FlagCodeDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.shipCountryCd(rs.getString("ship_country_cd"))
.cdNm(rs.getString("cd_nm"))
.isoTwoCd(rs.getString("iso_two_cd"))
.isoThrCd(rs.getString("iso_thr_cd"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceFlagCode);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected FlagCodeDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return FlagCodeDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.shipCountryCd(rs.getString("ship_country_cd"))
.cdNm(rs.getString("cd_nm"))
.isoTwoCd(rs.getString("iso_two_cd"))
.isoThrCd(rs.getString("iso_thr_cd"))
.build();
}
}

View file

@ -1,75 +1,43 @@
package com.snp.batch.jobs.datasync.batch.code.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.code.dto.Stat5CodeDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class Stat5CodeReader implements ItemReader<Stat5CodeDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<Stat5CodeDto> allDataBuffer = new ArrayList<>();
public class Stat5CodeReader extends BaseSyncReader<Stat5CodeDto> {
public Stat5CodeReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public Stat5CodeDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceStat5Code;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceStat5Code), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[Stat5CodeReader] 다음 처리 대상 ID 발견: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceStat5Code);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return Stat5CodeDto.builder()
.jobExecutionId(targetId)
.lvOne(rs.getString("lv_one"))
.lvOneDesc(rs.getString("lv_one_desc"))
.lvTwo(rs.getString("lv_two"))
.lvTwoDesc(rs.getString("lv_two_desc"))
.lvThr(rs.getString("lv_thr"))
.lvThrDesc(rs.getString("lv_thr_desc"))
.lvFour(rs.getString("lv_four"))
.lvFourDesc(rs.getString("lv_four_desc"))
.lvFive(rs.getString("lv_five"))
.lvFiveDesc(rs.getString("lv_five_desc"))
.dtlDesc(rs.getString("dtl_desc"))
.rlsIem(rs.getString("rls_iem"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceStat5Code);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected Stat5CodeDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return Stat5CodeDto.builder()
.jobExecutionId(targetId)
.lvOne(rs.getString("lv_one"))
.lvOneDesc(rs.getString("lv_one_desc"))
.lvTwo(rs.getString("lv_two"))
.lvTwoDesc(rs.getString("lv_two_desc"))
.lvThr(rs.getString("lv_thr"))
.lvThrDesc(rs.getString("lv_thr_desc"))
.lvFour(rs.getString("lv_four"))
.lvFourDesc(rs.getString("lv_four_desc"))
.lvFive(rs.getString("lv_five"))
.lvFiveDesc(rs.getString("lv_five_desc"))
.dtlDesc(rs.getString("dtl_desc"))
.rlsIem(rs.getString("rls_iem"))
.build();
}
}

View file

@ -80,7 +80,7 @@ public class CodeRepositoryImpl extends MultiDataSourceJdbcRepository<FlagCodeEn
if (flagCodeEntityList == null || flagCodeEntityList.isEmpty()) {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", "FlagCodeEntity", flagCodeEntityList.size());
// log.debug("{} 배치 삽입 시작: {} 건", "FlagCodeEntity", flagCodeEntityList.size());
batchJdbcTemplate.batchUpdate(sql, flagCodeEntityList, flagCodeEntityList.size(),
(ps, entity) -> {
@ -92,7 +92,7 @@ public class CodeRepositoryImpl extends MultiDataSourceJdbcRepository<FlagCodeEn
}
});
log.debug("{} 배치 삽입 완료: {} 건", "FlagCodeEntity", flagCodeEntityList.size());
// log.debug("{} 배치 삽입 완료: {} 건", "FlagCodeEntity", flagCodeEntityList.size());
}
public void bindFlagCode(PreparedStatement pstmt, FlagCodeEntity entity) throws Exception {
@ -111,7 +111,7 @@ public class CodeRepositoryImpl extends MultiDataSourceJdbcRepository<FlagCodeEn
if (stat5CodeEntityList == null || stat5CodeEntityList.isEmpty()) {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", "Stat5CodeEntity", stat5CodeEntityList.size());
// log.debug("{} 배치 삽입 시작: {} 건", "Stat5CodeEntity", stat5CodeEntityList.size());
batchJdbcTemplate.batchUpdate(sql, stat5CodeEntityList, stat5CodeEntityList.size(),
(ps, entity) -> {
@ -123,7 +123,7 @@ public class CodeRepositoryImpl extends MultiDataSourceJdbcRepository<FlagCodeEn
}
});
log.debug("{} 배치 삽입 완료: {} 건", "Stat5CodeEntity", stat5CodeEntityList.size());
// log.debug("{} 배치 삽입 완료: {} 건", "Stat5CodeEntity", stat5CodeEntityList.size());
}
public void bindStat5Code(PreparedStatement pstmt, Stat5CodeEntity entity) throws Exception {

View file

@ -2,10 +2,7 @@ package com.snp.batch.jobs.datasync.batch.compliance.config;
import com.snp.batch.common.batch.config.BaseJobConfig;
import com.snp.batch.common.util.BatchWriteListener;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.util.GroupByExecutionIdChunkListener;
import com.snp.batch.common.util.GroupByExecutionIdPolicy;
import com.snp.batch.common.util.GroupByExecutionIdReadListener;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.compliance.dto.CompanyComplianceDto;
import com.snp.batch.jobs.datasync.batch.compliance.entity.CompanyComplianceEntity;
@ -99,8 +96,7 @@ public class CompanyComplianceSyncJobConfig extends BaseJobConfig<CompanyComplia
@Bean
public BatchWriteListener<CompanyComplianceEntity> companyComplianceWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceTbCompanyComplianceInfo);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceTbCompanyComplianceInfo);
}
// --- Steps ---
@ -109,12 +105,10 @@ public class CompanyComplianceSyncJobConfig extends BaseJobConfig<CompanyComplia
public Step companyComplianceSyncStep() {
log.info("Step 생성: companyComplianceSyncStep");
return new StepBuilder(getStepName(), jobRepository)
.<CompanyComplianceDto, CompanyComplianceEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<CompanyComplianceDto, CompanyComplianceEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(createReader())
.processor(createProcessor())
.writer(createWriter())
.listener(new GroupByExecutionIdReadListener<CompanyComplianceDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(companyComplianceWriteListener())
.build();
}

View file

@ -2,10 +2,7 @@ package com.snp.batch.jobs.datasync.batch.compliance.config;
import com.snp.batch.common.batch.config.BaseJobConfig;
import com.snp.batch.common.util.BatchWriteListener;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.util.GroupByExecutionIdChunkListener;
import com.snp.batch.common.util.GroupByExecutionIdPolicy;
import com.snp.batch.common.util.GroupByExecutionIdReadListener;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.compliance.dto.ShipComplianceDto;
import com.snp.batch.jobs.datasync.batch.compliance.entity.ShipComplianceEntity;
@ -99,8 +96,7 @@ public class ShipComplianceSyncJobConfig extends BaseJobConfig<ShipComplianceDto
@Bean
public BatchWriteListener<ShipComplianceEntity> shipComplianceWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceCompliance);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceCompliance);
}
// --- Steps ---
@ -109,12 +105,10 @@ public class ShipComplianceSyncJobConfig extends BaseJobConfig<ShipComplianceDto
public Step shipComplianceSyncStep() {
log.info("Step 생성: shipComplianceSyncStep");
return new StepBuilder(getStepName(), jobRepository)
.<ShipComplianceDto, ShipComplianceEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<ShipComplianceDto, ShipComplianceEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(createReader())
.processor(createProcessor())
.writer(createWriter())
.listener(new GroupByExecutionIdReadListener<ShipComplianceDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(shipComplianceWriteListener())
.build();
}

View file

@ -1,82 +1,50 @@
package com.snp.batch.jobs.datasync.batch.compliance.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.compliance.dto.CompanyComplianceDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.ArrayList;
import java.util.List;
@Slf4j
public class CompanyComplianceReader implements ItemReader<CompanyComplianceDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<CompanyComplianceDto> allDataBuffer = new ArrayList<>();
public class CompanyComplianceReader extends BaseSyncReader<CompanyComplianceDto> {
public CompanyComplianceReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public CompanyComplianceDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceTbCompanyComplianceInfo;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceTbCompanyComplianceInfo), Long.class);
} catch (Exception e) {
return;
}
@Override
protected CompanyComplianceDto mapRow(ResultSet rs, Long targetId) throws SQLException {
Timestamp lstMdfcnDtTs = rs.getTimestamp("lst_mdfcn_dt");
if (nextTargetId != null) {
log.info("[CompanyComplianceReader] 다음 처리 대상 ID 발견: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceTbCompanyComplianceInfo);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
Timestamp lstMdfcnDtTs = rs.getTimestamp("lst_mdfcn_dt");
return CompanyComplianceDto.builder()
.jobExecutionId(targetId)
.companyCd(rs.getString("company_cd"))
.lstMdfcnDt(lstMdfcnDtTs != null ? lstMdfcnDtTs.toLocalDateTime() : null)
.companySnthsComplianceStatus(rs.getObject("company_snths_compliance_status") != null ? rs.getLong("company_snths_compliance_status") : null)
.companyAusSanctionList(rs.getObject("company_aus_sanction_list") != null ? rs.getLong("company_aus_sanction_list") : null)
.companyBesSanctionList(rs.getObject("company_bes_sanction_list") != null ? rs.getLong("company_bes_sanction_list") : null)
.companyCanSanctionList(rs.getObject("company_can_sanction_list") != null ? rs.getLong("company_can_sanction_list") : null)
.companyOfacSanctionCountry(rs.getObject("company_ofac_sanction_country") != null ? rs.getLong("company_ofac_sanction_country") : null)
.companyFatfCmptncCountry(rs.getObject("company_fatf_cmptnc_country") != null ? rs.getLong("company_fatf_cmptnc_country") : null)
.companyEuSanctionList(rs.getObject("company_eu_sanction_list") != null ? rs.getLong("company_eu_sanction_list") : null)
.companyOfacSanctionList(rs.getObject("company_ofac_sanction_list") != null ? rs.getLong("company_ofac_sanction_list") : null)
.companyOfacNonSdnSanctionList(rs.getObject("company_ofac_non_sdn_sanction_list") != null ? rs.getLong("company_ofac_non_sdn_sanction_list") : null)
.companyOfacssiSanctionList(rs.getObject("company_ofacssi_sanction_list") != null ? rs.getLong("company_ofacssi_sanction_list") : null)
.companySwissSanctionList(rs.getObject("company_swiss_sanction_list") != null ? rs.getLong("company_swiss_sanction_list") : null)
.companyUaeSanctionList(rs.getObject("company_uae_sanction_list") != null ? rs.getLong("company_uae_sanction_list") : null)
.companyUnSanctionList(rs.getObject("company_un_sanction_list") != null ? rs.getLong("company_un_sanction_list") : null)
.prntCompanyComplianceRisk(rs.getObject("prnt_company_compliance_risk") != null ? rs.getLong("prnt_company_compliance_risk") : null)
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceTbCompanyComplianceInfo);
businessJdbcTemplate.update(sql, targetExecutionId);
return CompanyComplianceDto.builder()
.jobExecutionId(targetId)
.companyCd(rs.getString("company_cd"))
.lstMdfcnDt(lstMdfcnDtTs != null ? lstMdfcnDtTs.toLocalDateTime() : null)
.companySnthsComplianceStatus(rs.getObject("company_snths_compliance_status") != null ? rs.getLong("company_snths_compliance_status") : null)
.companyAusSanctionList(rs.getObject("company_aus_sanction_list") != null ? rs.getLong("company_aus_sanction_list") : null)
.companyBesSanctionList(rs.getObject("company_bes_sanction_list") != null ? rs.getLong("company_bes_sanction_list") : null)
.companyCanSanctionList(rs.getObject("company_can_sanction_list") != null ? rs.getLong("company_can_sanction_list") : null)
.companyOfacSanctionCountry(rs.getObject("company_ofac_sanction_country") != null ? rs.getLong("company_ofac_sanction_country") : null)
.companyFatfCmptncCountry(rs.getObject("company_fatf_cmptnc_country") != null ? rs.getLong("company_fatf_cmptnc_country") : null)
.companyEuSanctionList(rs.getObject("company_eu_sanction_list") != null ? rs.getLong("company_eu_sanction_list") : null)
.companyOfacSanctionList(rs.getObject("company_ofac_sanction_list") != null ? rs.getLong("company_ofac_sanction_list") : null)
.companyOfacNonSdnSanctionList(rs.getObject("company_ofac_non_sdn_sanction_list") != null ? rs.getLong("company_ofac_non_sdn_sanction_list") : null)
.companyOfacssiSanctionList(rs.getObject("company_ofacssi_sanction_list") != null ? rs.getLong("company_ofacssi_sanction_list") : null)
.companySwissSanctionList(rs.getObject("company_swiss_sanction_list") != null ? rs.getLong("company_swiss_sanction_list") : null)
.companyUaeSanctionList(rs.getObject("company_uae_sanction_list") != null ? rs.getLong("company_uae_sanction_list") : null)
.companyUnSanctionList(rs.getObject("company_un_sanction_list") != null ? rs.getLong("company_un_sanction_list") : null)
.prntCompanyComplianceRisk(rs.getObject("prnt_company_compliance_risk") != null ? rs.getLong("prnt_company_compliance_risk") : null)
.build();
}
}

View File

@ -1,101 +1,69 @@
package com.snp.batch.jobs.datasync.batch.compliance.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.compliance.dto.ShipComplianceDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.ArrayList;
import java.util.List;
@Slf4j
public class ShipComplianceReader implements ItemReader<ShipComplianceDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<ShipComplianceDto> allDataBuffer = new ArrayList<>();
public class ShipComplianceReader extends BaseSyncReader<ShipComplianceDto> {
public ShipComplianceReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public ShipComplianceDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceCompliance;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceCompliance), Long.class);
} catch (Exception e) {
return;
}
@Override
protected ShipComplianceDto mapRow(ResultSet rs, Long targetId) throws SQLException {
Timestamp lastMdfcnDtTs = rs.getTimestamp("last_mdfcn_dt");
if (nextTargetId != null) {
log.info("[ShipComplianceReader] Next target ID found: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceCompliance);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
Timestamp lastMdfcnDtTs = rs.getTimestamp("last_mdfcn_dt");
return ShipComplianceDto.builder()
.jobExecutionId(targetId)
.imoNo(rs.getString("imo_no"))
.lastMdfcnDt(lastMdfcnDtTs != null ? lastMdfcnDtTs.toLocalDateTime() : null)
.lglSnthsSanction(rs.getString("lgl_snths_sanction"))
.shipBesSanctionList(rs.getString("ship_bes_sanction_list"))
.shipDarkActvInd(rs.getString("ship_dark_actv_ind"))
.shipDtldInfoNtmntd(rs.getString("ship_dtld_info_ntmntd"))
.shipEuSanctionList(rs.getString("ship_eu_sanction_list"))
.shipFlgDspt(rs.getString("ship_flg_dspt"))
.shipFlgSanctionCountry(rs.getString("ship_flg_sanction_country"))
.shipFlgSanctionCountryHstry(rs.getString("ship_flg_sanction_country_hstry"))
.shipOfacNonSdnSanctionList(rs.getString("ship_ofac_non_sdn_sanction_list"))
.shipOfacSanctionList(rs.getString("ship_ofac_sanction_list"))
.shipOfacCutnList(rs.getString("ship_ofac_cutn_list"))
.shipOwnrOfcsSanctionList(rs.getString("ship_ownr_ofcs_sanction_list"))
.shipOwnrAusSanctionList(rs.getString("ship_ownr_aus_sanction_list"))
.shipOwnrBesSanctionList(rs.getString("ship_ownr_bes_sanction_list"))
.shipOwnrCanSanctionList(rs.getString("ship_ownr_can_sanction_list"))
.shipOwnrEuSanctionList(rs.getString("ship_ownr_eu_sanction_list"))
.shipOwnrFatfRglZone(rs.getString("ship_ownr_fatf_rgl_zone"))
.shipOwnrOfacSanctionHstry(rs.getString("ship_ownr_ofac_sanction_hstry"))
.shipOwnrOfacSanctionList(rs.getString("ship_ownr_ofac_sanction_list"))
.shipOwnrOfacSanctionCountry(rs.getString("ship_ownr_ofac_sanction_country"))
.shipOwnrPrntCompanyNcmplnc(rs.getString("ship_ownr_prnt_company_ncmplnc"))
.shipOwnrPrntCompanyFatfRglZone(rs.getString("ship_ownr_prnt_company_fatf_rgl_zone"))
.shipOwnrPrntCompanyOfacSanctionCountry(rs.getString("ship_ownr_prnt_company_ofac_sanction_country"))
.shipOwnrSwiSanctionList(rs.getString("ship_ownr_swi_sanction_list"))
.shipOwnrUaeSanctionList(rs.getString("ship_ownr_uae_sanction_list"))
.shipOwnrUnSanctionList(rs.getString("ship_ownr_un_sanction_list"))
.shipSanctionCountryPrtcllLastTwelveM(rs.getString("ship_sanction_country_prtcll_last_twelve_m"))
.shipSanctionCountryPrtcllLastThrM(rs.getString("ship_sanction_country_prtcll_last_thr_m"))
.shipSanctionCountryPrtcllLastSixM(rs.getString("ship_sanction_country_prtcll_last_six_m"))
.shipScrtyLglDsptEvent(rs.getString("ship_scrty_lgl_dspt_event"))
.shipStsPrtnrNonComplianceTwelveM(rs.getString("ship_sts_prtnr_non_compliance_twelve_m"))
.shipSwiSanctionList(rs.getString("ship_swi_sanction_list"))
.shipUnSanctionList(rs.getString("ship_un_sanction_list"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceCompliance);
businessJdbcTemplate.update(sql, targetExecutionId);
return ShipComplianceDto.builder()
.jobExecutionId(targetId)
.imoNo(rs.getString("imo_no"))
.lastMdfcnDt(lastMdfcnDtTs != null ? lastMdfcnDtTs.toLocalDateTime() : null)
.lglSnthsSanction(rs.getString("lgl_snths_sanction"))
.shipBesSanctionList(rs.getString("ship_bes_sanction_list"))
.shipDarkActvInd(rs.getString("ship_dark_actv_ind"))
.shipDtldInfoNtmntd(rs.getString("ship_dtld_info_ntmntd"))
.shipEuSanctionList(rs.getString("ship_eu_sanction_list"))
.shipFlgDspt(rs.getString("ship_flg_dspt"))
.shipFlgSanctionCountry(rs.getString("ship_flg_sanction_country"))
.shipFlgSanctionCountryHstry(rs.getString("ship_flg_sanction_country_hstry"))
.shipOfacNonSdnSanctionList(rs.getString("ship_ofac_non_sdn_sanction_list"))
.shipOfacSanctionList(rs.getString("ship_ofac_sanction_list"))
.shipOfacCutnList(rs.getString("ship_ofac_cutn_list"))
.shipOwnrOfcsSanctionList(rs.getString("ship_ownr_ofcs_sanction_list"))
.shipOwnrAusSanctionList(rs.getString("ship_ownr_aus_sanction_list"))
.shipOwnrBesSanctionList(rs.getString("ship_ownr_bes_sanction_list"))
.shipOwnrCanSanctionList(rs.getString("ship_ownr_can_sanction_list"))
.shipOwnrEuSanctionList(rs.getString("ship_ownr_eu_sanction_list"))
.shipOwnrFatfRglZone(rs.getString("ship_ownr_fatf_rgl_zone"))
.shipOwnrOfacSanctionHstry(rs.getString("ship_ownr_ofac_sanction_hstry"))
.shipOwnrOfacSanctionList(rs.getString("ship_ownr_ofac_sanction_list"))
.shipOwnrOfacSanctionCountry(rs.getString("ship_ownr_ofac_sanction_country"))
.shipOwnrPrntCompanyNcmplnc(rs.getString("ship_ownr_prnt_company_ncmplnc"))
.shipOwnrPrntCompanyFatfRglZone(rs.getString("ship_ownr_prnt_company_fatf_rgl_zone"))
.shipOwnrPrntCompanyOfacSanctionCountry(rs.getString("ship_ownr_prnt_company_ofac_sanction_country"))
.shipOwnrSwiSanctionList(rs.getString("ship_ownr_swi_sanction_list"))
.shipOwnrUaeSanctionList(rs.getString("ship_ownr_uae_sanction_list"))
.shipOwnrUnSanctionList(rs.getString("ship_ownr_un_sanction_list"))
.shipSanctionCountryPrtcllLastTwelveM(rs.getString("ship_sanction_country_prtcll_last_twelve_m"))
.shipSanctionCountryPrtcllLastThrM(rs.getString("ship_sanction_country_prtcll_last_thr_m"))
.shipSanctionCountryPrtcllLastSixM(rs.getString("ship_sanction_country_prtcll_last_six_m"))
.shipScrtyLglDsptEvent(rs.getString("ship_scrty_lgl_dspt_event"))
.shipStsPrtnrNonComplianceTwelveM(rs.getString("ship_sts_prtnr_non_compliance_twelve_m"))
.shipSwiSanctionList(rs.getString("ship_swi_sanction_list"))
.shipUnSanctionList(rs.getString("ship_un_sanction_list"))
.build();
}
}
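The refactor above collapses each hand-rolled `ItemReader` (buffer field, `read()`, `fetchNextGroup()`, `updateBatchProcessing()`) into two overrides on a shared `BaseSyncReader`. A minimal, runnable sketch of that template-method contract, with the JDBC plumbing replaced by an in-memory buffer so it stands alone (`loadGroup` and the `Object` row type are illustrative stand-ins, not the real API):

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

// Illustrative sketch of the template-method pattern the refactored readers use.
// getSourceTable()/mapRow() mirror the hooks in the diff; loadGroup() stands in
// for the real fetchNextGroup(), which queries the business DB for the next
// execution group and marks it as processing.
abstract class BaseSyncReader<T> {
    private final Deque<T> buffer = new ArrayDeque<>();

    // Subclasses name their source table; the real base class builds SQL from it.
    protected abstract String getSourceTable();

    // Subclasses map one source row to a DTO tagged with the target execution id.
    protected abstract T mapRow(Object row, Long targetId);

    // Stand-in for fetchNextGroup(): buffer one group's rows at once.
    void loadGroup(List<?> rows, Long targetId) {
        for (Object row : rows) {
            buffer.add(mapRow(row, targetId));
        }
    }

    // read() drains the buffer one item at a time; returning null signals
    // end of input, matching the Spring Batch ItemReader contract.
    public T read() {
        return buffer.poll();
    }
}
```

Each concrete reader then shrinks to the two overrides, as the `ShipComplianceReader` hunk above and the `EventCargoReader` hunks below show.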

View File

@ -80,7 +80,7 @@ public class ComplianceRepositoryImpl extends MultiDataSourceJdbcRepository<Ship
if (shipComplianceEntityList == null || shipComplianceEntityList.isEmpty()) {
return;
}
log.debug("{} batch insert started: {} rows", "ShipComplianceEntity", shipComplianceEntityList.size());
// log.debug("{} batch insert started: {} rows", "ShipComplianceEntity", shipComplianceEntityList.size());
batchJdbcTemplate.batchUpdate(sql, shipComplianceEntityList, shipComplianceEntityList.size(),
(ps, entity) -> {
@ -92,7 +92,7 @@ public class ComplianceRepositoryImpl extends MultiDataSourceJdbcRepository<Ship
}
});
log.debug("{} batch insert completed: {} rows", "ShipComplianceEntity", shipComplianceEntityList.size());
// log.debug("{} batch insert completed: {} rows", "ShipComplianceEntity", shipComplianceEntityList.size());
}
@Override
@ -122,7 +122,7 @@ public class ComplianceRepositoryImpl extends MultiDataSourceJdbcRepository<Ship
if (companyComplianceEntityList == null || companyComplianceEntityList.isEmpty()) {
return;
}
log.debug("{} batch insert started: {} rows", "CompanyComplianceEntity", companyComplianceEntityList.size());
// log.debug("{} batch insert started: {} rows", "CompanyComplianceEntity", companyComplianceEntityList.size());
batchJdbcTemplate.batchUpdate(sql, companyComplianceEntityList, companyComplianceEntityList.size(),
(ps, entity) -> {
@ -134,7 +134,7 @@ public class ComplianceRepositoryImpl extends MultiDataSourceJdbcRepository<Ship
}
});
log.debug("{} batch insert completed: {} rows", "CompanyComplianceEntity", companyComplianceEntityList.size());
// log.debug("{} batch insert completed: {} rows", "CompanyComplianceEntity", companyComplianceEntityList.size());
}
@Override
@ -281,7 +281,7 @@ public class ComplianceRepositoryImpl extends MultiDataSourceJdbcRepository<Ship
if (companyComplianceChangeEntityList == null || companyComplianceChangeEntityList.isEmpty()) {
return;
}
log.debug("{} batch insert started: {} rows", "CompanyComplianceChangeEntity", companyComplianceChangeEntityList.size());
// log.debug("{} batch insert started: {} rows", "CompanyComplianceChangeEntity", companyComplianceChangeEntityList.size());
batchJdbcTemplate.batchUpdate(sql, companyComplianceChangeEntityList, companyComplianceChangeEntityList.size(),
(ps, entity) -> {
@ -293,7 +293,7 @@ public class ComplianceRepositoryImpl extends MultiDataSourceJdbcRepository<Ship
}
});
log.debug("{} batch insert completed: {} rows", "CompanyComplianceChangeEntity", companyComplianceChangeEntityList.size());
// log.debug("{} batch insert completed: {} rows", "CompanyComplianceChangeEntity", companyComplianceChangeEntityList.size());
}
public void bindCompanyComplianceChange(PreparedStatement pstmt, CompanyComplianceChangeEntity entity) throws Exception {
@ -312,7 +312,7 @@ public class ComplianceRepositoryImpl extends MultiDataSourceJdbcRepository<Ship
if (shipComplianceChangeEntityList == null || shipComplianceChangeEntityList.isEmpty()) {
return;
}
log.debug("{} batch insert started: {} rows", "ShipComplianceChangeEntity", shipComplianceChangeEntityList.size());
// log.debug("{} batch insert started: {} rows", "ShipComplianceChangeEntity", shipComplianceChangeEntityList.size());
batchJdbcTemplate.batchUpdate(sql, shipComplianceChangeEntityList, shipComplianceChangeEntityList.size(),
(ps, entity) -> {
@ -324,7 +324,7 @@ public class ComplianceRepositoryImpl extends MultiDataSourceJdbcRepository<Ship
}
});
log.debug("{} batch insert completed: {} rows", "ShipComplianceChangeEntity", shipComplianceChangeEntityList.size());
// log.debug("{} batch insert completed: {} rows", "ShipComplianceChangeEntity", shipComplianceChangeEntityList.size());
}
public void bindShipComplianceChange(PreparedStatement pstmt, ShipComplianceChangeEntity entity) throws Exception {

View File

@ -2,10 +2,7 @@ package com.snp.batch.jobs.datasync.batch.event.config;
import com.snp.batch.common.batch.config.BaseJobConfig;
import com.snp.batch.common.util.BatchWriteListener;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.util.GroupByExecutionIdChunkListener;
import com.snp.batch.common.util.GroupByExecutionIdPolicy;
import com.snp.batch.common.util.GroupByExecutionIdReadListener;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.event.dto.EventCargoDto;
import com.snp.batch.jobs.datasync.batch.event.dto.EventDto;
@ -147,26 +144,22 @@ public class EventSyncJobConfig extends BaseJobConfig<EventDto, EventEntity> {
@Bean
public BatchWriteListener<EventEntity> eventWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceEvent);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceEvent);
}
@Bean
public BatchWriteListener<EventCargoEntity> eventCargoWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceEventCargo);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceEventCargo);
}
@Bean
public BatchWriteListener<EventHumanCasualtyEntity> eventHumanCasualtyWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceEventHumanCasualty);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceEventHumanCasualty);
}
@Bean
public BatchWriteListener<EventRelationshipEntity> eventRelationshipWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceEventRelationship);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceEventRelationship);
}
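In the listener beans above, the callers no longer precompute the completion SQL via `CommonSql.getCompleteBatchQuery(...)`; they now pass the source table name and `BatchWriteListener` derives the query itself. A hedged sketch of that constructor change (the query shape and the `completeSql()` accessor are assumptions for illustration; the real class wraps a `JdbcTemplate` and gets its SQL from `CommonSql`):

```java
// Illustrative sketch only: the real listener runs the completion query after
// each chunk write. The point here is the constructor change -- take the source
// table name and build the SQL internally, instead of receiving it prebuilt.
class BatchWriteListener {
    private final String completeSql;

    // Before: BatchWriteListener(jdbcTemplate, sql)         -- caller built the query.
    // After:  BatchWriteListener(jdbcTemplate, sourceTable) -- listener builds it.
    BatchWriteListener(String sourceTable) {
        // Assumed query shape; the actual SQL lives in CommonSql.getCompleteBatchQuery.
        this.completeSql = "UPDATE " + sourceTable
                + " SET batch_status = 'COMPLETE' WHERE job_execution_id = ?";
    }

    String completeSql() {
        return completeSql;
    }
}
```

This keeps SQL construction in one place, so the four bean definitions differ only in the table name they pass.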
// --- Steps ---
@ -175,12 +168,10 @@ public class EventSyncJobConfig extends BaseJobConfig<EventDto, EventEntity> {
public Step eventSyncStep() {
log.info("Creating step: eventSyncStep");
return new StepBuilder(getStepName(), jobRepository)
.<EventDto, EventEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<EventDto, EventEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(createReader())
.processor(createProcessor())
.writer(createWriter())
.listener(new GroupByExecutionIdReadListener<EventDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(eventWriteListener())
.build();
}
@ -189,12 +180,10 @@ public class EventSyncJobConfig extends BaseJobConfig<EventDto, EventEntity> {
public Step eventCargoSyncStep() {
log.info("Creating step: eventCargoSyncStep");
return new StepBuilder("eventCargoSyncStep", jobRepository)
.<EventCargoDto, EventCargoEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<EventCargoDto, EventCargoEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(eventCargoReader(businessDataSource, tableMetaInfo))
.processor(new EventCargoProcessor())
.writer(new EventCargoWriter(eventRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<EventCargoDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(eventCargoWriteListener())
.build();
}
@ -203,12 +192,10 @@ public class EventSyncJobConfig extends BaseJobConfig<EventDto, EventEntity> {
public Step eventHumanCasualtySyncStep() {
log.info("Creating step: eventHumanCasualtySyncStep");
return new StepBuilder("eventHumanCasualtySyncStep", jobRepository)
.<EventHumanCasualtyDto, EventHumanCasualtyEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<EventHumanCasualtyDto, EventHumanCasualtyEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(eventHumanCasualtyReader(businessDataSource, tableMetaInfo))
.processor(new EventHumanCasualtyProcessor())
.writer(new EventHumanCasualtyWriter(eventRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<EventHumanCasualtyDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(eventHumanCasualtyWriteListener())
.build();
}
@ -217,12 +204,10 @@ public class EventSyncJobConfig extends BaseJobConfig<EventDto, EventEntity> {
public Step eventRelationshipSyncStep() {
log.info("Creating step: eventRelationshipSyncStep");
return new StepBuilder("eventRelationshipSyncStep", jobRepository)
.<EventRelationshipDto, EventRelationshipEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<EventRelationshipDto, EventRelationshipEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(eventRelationshipReader(businessDataSource, tableMetaInfo))
.processor(new EventRelationshipProcessor())
.writer(new EventRelationshipWriter(eventRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<EventRelationshipDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(eventRelationshipWriteListener())
.build();
}

View File

@ -1,73 +1,41 @@
package com.snp.batch.jobs.datasync.batch.event.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.event.dto.EventCargoDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class EventCargoReader implements ItemReader<EventCargoDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<EventCargoDto> allDataBuffer = new ArrayList<>();
public class EventCargoReader extends BaseSyncReader<EventCargoDto> {
public EventCargoReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public EventCargoDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceEventCargo;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceEventCargo), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[EventCargoReader] Next target ID found: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceEventCargo);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return EventCargoDto.builder()
.jobExecutionId(targetId)
.eventId(rs.getObject("event_id") != null ? rs.getInt("event_id") : null)
.imoNo(rs.getString("imo_no"))
.type(rs.getString("type"))
.eventSeq(rs.getString("event_seq"))
.cnt(rs.getObject("cnt") != null ? rs.getLong("cnt") : null)
.unitAbbr(rs.getString("unit_abbr"))
.unit(rs.getString("unit"))
.cargoDamg(rs.getString("cargo_damg"))
.riskYn(rs.getString("risk_yn"))
.text(rs.getString("text"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceEventCargo);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected EventCargoDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return EventCargoDto.builder()
.jobExecutionId(targetId)
.eventId(rs.getObject("event_id") != null ? rs.getInt("event_id") : null)
.imoNo(rs.getString("imo_no"))
.type(rs.getString("type"))
.eventSeq(rs.getString("event_seq"))
.cnt(rs.getObject("cnt") != null ? rs.getLong("cnt") : null)
.unitAbbr(rs.getString("unit_abbr"))
.unit(rs.getString("unit"))
.cargoDamg(rs.getString("cargo_damg"))
.riskYn(rs.getString("risk_yn"))
.text(rs.getString("text"))
.build();
}
}

View File

@ -1,68 +1,36 @@
package com.snp.batch.jobs.datasync.batch.event.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.event.dto.EventHumanCasualtyDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class EventHumanCasualtyReader implements ItemReader<EventHumanCasualtyDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<EventHumanCasualtyDto> allDataBuffer = new ArrayList<>();
public class EventHumanCasualtyReader extends BaseSyncReader<EventHumanCasualtyDto> {
public EventHumanCasualtyReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public EventHumanCasualtyDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceEventHumanCasualty;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceEventHumanCasualty), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[EventHumanCasualtyReader] Next target ID found: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceEventHumanCasualty);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return EventHumanCasualtyDto.builder()
.jobExecutionId(targetId)
.eventId(rs.getObject("event_id") != null ? rs.getLong("event_id") : null)
.type(rs.getString("type"))
.scope(rs.getString("scope"))
.qualfr(rs.getString("qualfr"))
.cnt(rs.getObject("cnt") != null ? rs.getLong("cnt") : null)
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceEventHumanCasualty);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected EventHumanCasualtyDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return EventHumanCasualtyDto.builder()
.jobExecutionId(targetId)
.eventId(rs.getObject("event_id") != null ? rs.getLong("event_id") : null)
.type(rs.getString("type"))
.scope(rs.getString("scope"))
.qualfr(rs.getString("qualfr"))
.cnt(rs.getObject("cnt") != null ? rs.getLong("cnt") : null)
.build();
}
}

View File

@ -1,110 +1,79 @@
package com.snp.batch.jobs.datasync.batch.event.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.event.dto.EventDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.ArrayList;
import java.util.List;
import java.time.ZoneId;
@Slf4j
public class EventReader implements ItemReader<EventDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<EventDto> allDataBuffer = new ArrayList<>();
public class EventReader extends BaseSyncReader<EventDto> {
public EventReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public EventDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceEvent;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceEvent), Long.class);
} catch (Exception e) {
return;
}
@Override
protected EventDto mapRow(ResultSet rs, Long targetId) throws SQLException {
Timestamp pstgYmdTs = rs.getTimestamp("pstg_ymd");
Timestamp eventStartDayTs = rs.getTimestamp("event_start_day");
Timestamp eventEndDayTs = rs.getTimestamp("event_end_day");
if (nextTargetId != null) {
log.info("[EventReader] Next target ID found: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceEvent);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
Timestamp pstgYmdTs = rs.getTimestamp("pstg_ymd");
Timestamp eventStartDayTs = rs.getTimestamp("event_start_day");
Timestamp eventEndDayTs = rs.getTimestamp("event_end_day");
return EventDto.builder()
.jobExecutionId(targetId)
.eventId(rs.getObject("event_id") != null ? rs.getInt("event_id") : null)
.acdntId(rs.getString("acdnt_id"))
.imoNo(rs.getString("imo_no"))
.pstgYmd(pstgYmdTs != null ? pstgYmdTs.toInstant().atZone(java.time.ZoneId.systemDefault()) : null)
.eventStartDay(eventStartDayTs != null ? eventStartDayTs.toInstant().atZone(java.time.ZoneId.systemDefault()) : null)
.eventEndDay(eventEndDayTs != null ? eventEndDayTs.toInstant().atZone(java.time.ZoneId.systemDefault()) : null)
.embrkTryYn(rs.getString("embrk_try_yn"))
.cargoCapacityStatusCd(rs.getString("cargo_capacity_status_cd"))
.acdntActn(rs.getString("acdnt_actn"))
.acdntZone(rs.getString("acdnt_zone"))
.acdntZoneCd(rs.getString("acdnt_zone_cd"))
.cfgCmpntTwo(rs.getString("cfg_cmpnt_two"))
.countryCd(rs.getString("country_cd"))
.buildYmd(rs.getString("build_ymd"))
.desc(rs.getString("desc"))
.envPosition(rs.getString("env_position"))
.positionNm(rs.getString("position_nm"))
.masdGridRef(rs.getObject("masd_grid_ref") != null ? rs.getLong("masd_grid_ref") : null)
.ctyNm(rs.getString("cty_nm"))
.eventType(rs.getString("event_type"))
.eventTypeDtl(rs.getString("event_type_dtl"))
.eventTypeDtlId(rs.getObject("event_type_dtl_id") != null ? rs.getLong("event_type_dtl_id") : null)
.eventTypeId(rs.getObject("event_type_id") != null ? rs.getLong("event_type_id") : null)
.fireduponYn(rs.getString("firedupon_yn"))
.title(rs.getString("title"))
.ldtTimpt(rs.getObject("ldt_timpt") != null ? rs.getLong("ldt_timpt") : null)
.signfct(rs.getString("signfct"))
.wethr(rs.getString("wethr"))
.pltnMatral(rs.getString("pltn_matral"))
.pltnMatralCnt(rs.getObject("pltn_matral_cnt") != null ? rs.getLong("pltn_matral_cnt") : null)
.pltnMatralUnit(rs.getString("pltn_matral_unit"))
.regShponrCdHr(rs.getString("reg_shponr_cd_hr"))
.regShponrHr(rs.getString("reg_shponr_hr"))
.regShponrCountryCdHr(rs.getString("reg_shponr_country_cd_hr"))
.regShponrCountryHr(rs.getString("reg_shponr_country_hr"))
.shipDwt(rs.getObject("ship_dwt") != null ? rs.getLong("ship_dwt") : null)
.shipFlgCd(rs.getString("ship_flg_cd"))
.shipFlgDecd(rs.getString("ship_flg_decd"))
.shipGt(rs.getObject("ship_gt") != null ? rs.getLong("ship_gt") : null)
.shipNm(rs.getString("ship_nm"))
.shipType(rs.getString("ship_type"))
.shipTypeNm(rs.getString("ship_type_nm"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceEvent);
businessJdbcTemplate.update(sql, targetExecutionId);
return EventDto.builder()
.jobExecutionId(targetId)
.eventId(rs.getObject("event_id") != null ? rs.getInt("event_id") : null)
.acdntId(rs.getString("acdnt_id"))
.imoNo(rs.getString("imo_no"))
.pstgYmd(pstgYmdTs != null ? pstgYmdTs.toInstant().atZone(ZoneId.systemDefault()) : null)
.eventStartDay(eventStartDayTs != null ? eventStartDayTs.toInstant().atZone(ZoneId.systemDefault()) : null)
.eventEndDay(eventEndDayTs != null ? eventEndDayTs.toInstant().atZone(ZoneId.systemDefault()) : null)
.embrkTryYn(rs.getString("embrk_try_yn"))
.cargoCapacityStatusCd(rs.getString("cargo_capacity_status_cd"))
.acdntActn(rs.getString("acdnt_actn"))
.acdntZone(rs.getString("acdnt_zone"))
.acdntZoneCd(rs.getString("acdnt_zone_cd"))
.cfgCmpntTwo(rs.getString("cfg_cmpnt_two"))
.countryCd(rs.getString("country_cd"))
.buildYmd(rs.getString("build_ymd"))
.desc(rs.getString("desc"))
.envPosition(rs.getString("env_position"))
.positionNm(rs.getString("position_nm"))
.masdGridRef(rs.getObject("masd_grid_ref") != null ? rs.getLong("masd_grid_ref") : null)
.ctyNm(rs.getString("cty_nm"))
.eventType(rs.getString("event_type"))
.eventTypeDtl(rs.getString("event_type_dtl"))
.eventTypeDtlId(rs.getObject("event_type_dtl_id") != null ? rs.getLong("event_type_dtl_id") : null)
.eventTypeId(rs.getObject("event_type_id") != null ? rs.getLong("event_type_id") : null)
.fireduponYn(rs.getString("firedupon_yn"))
.title(rs.getString("title"))
.ldtTimpt(rs.getObject("ldt_timpt") != null ? rs.getLong("ldt_timpt") : null)
.signfct(rs.getString("signfct"))
.wethr(rs.getString("wethr"))
.pltnMatral(rs.getString("pltn_matral"))
.pltnMatralCnt(rs.getObject("pltn_matral_cnt") != null ? rs.getLong("pltn_matral_cnt") : null)
.pltnMatralUnit(rs.getString("pltn_matral_unit"))
.regShponrCdHr(rs.getString("reg_shponr_cd_hr"))
.regShponrHr(rs.getString("reg_shponr_hr"))
.regShponrCountryCdHr(rs.getString("reg_shponr_country_cd_hr"))
.regShponrCountryHr(rs.getString("reg_shponr_country_hr"))
.shipDwt(rs.getObject("ship_dwt") != null ? rs.getLong("ship_dwt") : null)
.shipFlgCd(rs.getString("ship_flg_cd"))
.shipFlgDecd(rs.getString("ship_flg_decd"))
.shipGt(rs.getObject("ship_gt") != null ? rs.getLong("ship_gt") : null)
.shipNm(rs.getString("ship_nm"))
.shipType(rs.getString("ship_type"))
.shipTypeNm(rs.getString("ship_type_nm"))
.build();
}
}

View File

@ -1,70 +1,38 @@
package com.snp.batch.jobs.datasync.batch.event.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.event.dto.EventRelationshipDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class EventRelationshipReader implements ItemReader<EventRelationshipDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<EventRelationshipDto> allDataBuffer = new ArrayList<>();
public class EventRelationshipReader extends BaseSyncReader<EventRelationshipDto> {
public EventRelationshipReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public EventRelationshipDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceEventRelationship;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceEventRelationship), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[EventRelationshipReader] 다음 처리 대상 ID 발견: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceEventRelationship);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return EventRelationshipDto.builder()
.jobExecutionId(targetId)
.acdntId(rs.getString("acdnt_id"))
.eventId(rs.getObject("event_id") != null ? rs.getLong("event_id") : null)
.eventIdTwo(rs.getObject("event_id_two") != null ? rs.getLong("event_id_two") : null)
.eventTypeCd(rs.getString("event_type_cd"))
.eventType(rs.getString("event_type"))
.relTypeCd(rs.getString("rel_type_cd"))
.relType(rs.getString("rel_type"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceEventRelationship);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected EventRelationshipDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return EventRelationshipDto.builder()
.jobExecutionId(targetId)
.acdntId(rs.getString("acdnt_id"))
.eventId(rs.getObject("event_id") != null ? rs.getLong("event_id") : null)
.eventIdTwo(rs.getObject("event_id_two") != null ? rs.getLong("event_id_two") : null)
.eventTypeCd(rs.getString("event_type_cd"))
.eventType(rs.getString("event_type"))
.relTypeCd(rs.getString("rel_type_cd"))
.relType(rs.getString("rel_type"))
.build();
}
}
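The refactoring above moves the `read()`/`fetchNextGroup()` buffering loop out of each reader and into `BaseSyncReader`, which is not shown in this diff. A minimal, dependency-free sketch of the assumed template-method shape (class and method names beyond `getSourceTable` are hypothetical):

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

// Sketch of the pattern BaseSyncReader appears to factor out: fetch the next
// pending group into a buffer, hand rows out one at a time, and return null
// when nothing remains -- the ItemReader end-of-input signal.
abstract class GroupBufferingReader<T> {
    private final Deque<T> buffer = new ArrayDeque<>();

    /** Subclasses only name their source table (cf. getSourceTable() above). */
    protected abstract String getSourceTable();

    /** Subclasses fetch and map one group; the bookkeeping stays in the base. */
    protected abstract List<T> fetchGroup(String sourceTable);

    public T read() {
        if (buffer.isEmpty()) {
            buffer.addAll(fetchGroup(getSourceTable()));
        }
        return buffer.poll(); // null ends the step's read loop
    }
}

// Tiny in-memory stand-in for a concrete reader: serves one group, then stops.
class FixedGroupReader extends GroupBufferingReader<String> {
    private boolean served = false;

    @Override protected String getSourceTable() { return "demo_table"; }

    @Override protected List<String> fetchGroup(String sourceTable) {
        if (served) return List.of();
        served = true;
        return List.of("row1", "row2");
    }
}
```

With this split, each concrete reader shrinks to a table name plus a `mapRow`, which is exactly what the rewritten `EventRelationshipReader` shows.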

View file

@@ -84,7 +84,7 @@ public class EventRepositoryImpl extends MultiDataSourceJdbcRepository<EventEnti
if (eventEntityList == null || eventEntityList.isEmpty()) {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", "EventEntity", eventEntityList.size());
// log.debug("{} 배치 삽입 시작: {} 건", "EventEntity", eventEntityList.size());
batchJdbcTemplate.batchUpdate(sql, eventEntityList, eventEntityList.size(),
(ps, entity) -> {
@@ -96,7 +96,7 @@ public class EventRepositoryImpl extends MultiDataSourceJdbcRepository<EventEnti
}
});
log.debug("{} 배치 삽입 완료: {} 건", "EventEntity", eventEntityList.size());
// log.debug("{} 배치 삽입 완료: {} 건", "EventEntity", eventEntityList.size());
}
public void bindEvent(PreparedStatement pstmt, EventEntity entity) throws Exception {
@@ -152,7 +152,7 @@ public class EventRepositoryImpl extends MultiDataSourceJdbcRepository<EventEnti
if (eventCargoEntityList == null || eventCargoEntityList.isEmpty()) {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", "EventCargoEntity", eventCargoEntityList.size());
// log.debug("{} 배치 삽입 시작: {} 건", "EventCargoEntity", eventCargoEntityList.size());
batchJdbcTemplate.batchUpdate(sql, eventCargoEntityList, eventCargoEntityList.size(),
(ps, entity) -> {
@@ -164,7 +164,7 @@ public class EventRepositoryImpl extends MultiDataSourceJdbcRepository<EventEnti
}
});
log.debug("{} 배치 삽입 완료: {} 건", "EventCargoEntity", eventCargoEntityList.size());
// log.debug("{} 배치 삽입 완료: {} 건", "EventCargoEntity", eventCargoEntityList.size());
}
public void bindEventCargo(PreparedStatement pstmt, EventCargoEntity entity) throws Exception {
@@ -188,7 +188,7 @@ public class EventRepositoryImpl extends MultiDataSourceJdbcRepository<EventEnti
if (eventHumanCasualtyEntityList == null || eventHumanCasualtyEntityList.isEmpty()) {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", "EventHumanCasualtyEntity", eventHumanCasualtyEntityList.size());
// log.debug("{} 배치 삽입 시작: {} 건", "EventHumanCasualtyEntity", eventHumanCasualtyEntityList.size());
batchJdbcTemplate.batchUpdate(sql, eventHumanCasualtyEntityList, eventHumanCasualtyEntityList.size(),
(ps, entity) -> {
@@ -200,7 +200,7 @@ public class EventRepositoryImpl extends MultiDataSourceJdbcRepository<EventEnti
}
});
log.debug("{} 배치 삽입 완료: {} 건", "EventHumanCasualtyEntity", eventHumanCasualtyEntityList.size());
// log.debug("{} 배치 삽입 완료: {} 건", "EventHumanCasualtyEntity", eventHumanCasualtyEntityList.size());
}
public void bindEventHumanCasualty(PreparedStatement pstmt, EventHumanCasualtyEntity entity) throws Exception {
@@ -219,7 +219,7 @@ public class EventRepositoryImpl extends MultiDataSourceJdbcRepository<EventEnti
if (eventRelationshipEntityList == null || eventRelationshipEntityList.isEmpty()) {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", "EventRelationshipEntity", eventRelationshipEntityList.size());
// log.debug("{} 배치 삽입 시작: {} 건", "EventRelationshipEntity", eventRelationshipEntityList.size());
batchJdbcTemplate.batchUpdate(sql, eventRelationshipEntityList, eventRelationshipEntityList.size(),
(ps, entity) -> {
@@ -231,7 +231,7 @@ public class EventRepositoryImpl extends MultiDataSourceJdbcRepository<EventEnti
}
});
log.debug("{} 배치 삽입 완료: {} 건", "EventRelationshipEntity", eventRelationshipEntityList.size());
// log.debug("{} 배치 삽입 완료: {} 건", "EventRelationshipEntity", eventRelationshipEntityList.size());
}
public void bindEventRelationship(PreparedStatement pstmt, EventRelationshipEntity entity) throws Exception {

View file

@@ -2,10 +2,7 @@ package com.snp.batch.jobs.datasync.batch.facility.config;
import com.snp.batch.common.batch.config.BaseJobConfig;
import com.snp.batch.common.util.BatchWriteListener;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.util.GroupByExecutionIdChunkListener;
import com.snp.batch.common.util.GroupByExecutionIdPolicy;
import com.snp.batch.common.util.GroupByExecutionIdReadListener;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.facility.dto.FacilityPortDto;
import com.snp.batch.jobs.datasync.batch.facility.entity.FacilityPortEntity;
@@ -102,8 +99,7 @@ public class FacilitySyncJobConfig extends BaseJobConfig<FacilityPortDto, Facili
@Bean
public BatchWriteListener<FacilityPortEntity> facilityPortWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceFacilityPort);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceFacilityPort);
}
// --- Steps ---
@@ -112,12 +108,10 @@ public class FacilitySyncJobConfig extends BaseJobConfig<FacilityPortDto, Facili
public Step facilityPortSyncStep() {
log.info("Step 생성: facilityPortSyncStep");
return new StepBuilder(getStepName(), jobRepository)
.<FacilityPortDto, FacilityPortEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<FacilityPortDto, FacilityPortEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(createReader())
.processor(createProcessor())
.writer(createWriter())
.listener(new GroupByExecutionIdReadListener<FacilityPortDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(facilityPortWriteListener())
.build();
}
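The listener change in this config (and repeated in every movement config below) inverts who builds the completion SQL: callers used to pass a pre-built query from `CommonSql.getCompleteBatchQuery(...)`, and now they pass only the source table name. A sketch of the assumed new constructor shape; the SQL text here is illustrative only, the real query lives in `CommonSql`:

```java
// Hypothetical sketch of the revised BatchWriteListener constructor implied
// by this diff: the listener derives its completion SQL from the table name
// internally, instead of each job config building the string itself.
class CompletionListenerSketch {
    private final String completeSql;

    CompletionListenerSketch(String sourceTable) {
        // assumed equivalent of CommonSql.getCompleteBatchQuery(sourceTable)
        this.completeSql = "UPDATE " + sourceTable
                + " SET batch_status = 'COMPLETED' WHERE job_execution_id = ?";
    }

    String completeSql() { return completeSql; }
}
```

This removes one line of duplicated SQL construction from each of the nine job configs touched in this PR.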

View file

@@ -1,117 +1,86 @@
package com.snp.batch.jobs.datasync.batch.facility.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.facility.dto.FacilityPortDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.ArrayList;
import java.util.List;
import java.time.ZoneId;
@Slf4j
public class FacilityPortReader implements ItemReader<FacilityPortDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<FacilityPortDto> allDataBuffer = new ArrayList<>();
public class FacilityPortReader extends BaseSyncReader<FacilityPortDto> {
public FacilityPortReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public FacilityPortDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceFacilityPort;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceFacilityPort), Long.class);
} catch (Exception e) {
return;
}
@Override
protected FacilityPortDto mapRow(ResultSet rs, Long targetId) throws SQLException {
Timestamp lastMdfcnDtTs = rs.getTimestamp("last_mdfcn_dt");
Timestamp regYmdTs = rs.getTimestamp("reg_ymd");
if (nextTargetId != null) {
log.info("[FacilityPortReader] 다음 처리 대상 ID 발견: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceFacilityPort);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
Timestamp lastMdfcnDtTs = rs.getTimestamp("last_mdfcn_dt");
Timestamp regYmdTs = rs.getTimestamp("reg_ymd");
return FacilityPortDto.builder()
.jobExecutionId(targetId)
.portId(rs.getObject("port_id") != null ? rs.getLong("port_id") : null)
.bfrId(rs.getString("bfr_id"))
.status(rs.getString("status"))
.portNm(rs.getString("port_nm"))
.unPortCd(rs.getString("un_port_cd"))
.countryCd(rs.getString("country_cd"))
.countryNm(rs.getString("country_nm"))
.areanm(rs.getString("areanm"))
.cntntnm(rs.getString("cntntnm"))
.mstPortId(rs.getString("mst_port_id"))
.latDecml(rs.getObject("lat_decml") != null ? rs.getDouble("lat_decml") : null)
.lonDecml(rs.getObject("lon_decml") != null ? rs.getDouble("lon_decml") : null)
.positionLat(rs.getObject("position_lat") != null ? rs.getDouble("position_lat") : null)
.positionLon(rs.getObject("position_lon") != null ? rs.getDouble("position_lon") : null)
.positionZVal(rs.getObject("position_z_val") != null ? rs.getDouble("position_z_val") : null)
.positionMvalVal(rs.getObject("position_mval_val") != null ? rs.getDouble("position_mval_val") : null)
.zValHasYn(rs.getObject("z_val_has_yn") != null ? rs.getBoolean("z_val_has_yn") : null)
.mvalValHasYn(rs.getObject("mval_val_has_yn") != null ? rs.getBoolean("mval_val_has_yn") : null)
.positionNulYn(rs.getObject("position_nul_yn") != null ? rs.getBoolean("position_nul_yn") : null)
.positionStsId(rs.getObject("position_sts_id") != null ? rs.getLong("position_sts_id") : null)
.hrZone(rs.getString("hr_zone"))
.daylgtSaveHr(rs.getObject("daylgt_save_hr") != null ? rs.getBoolean("daylgt_save_hr") : null)
.maxDraft(rs.getObject("max_draft") != null ? rs.getDouble("max_draft") : null)
.maxWhlnth(rs.getObject("max_whlnth") != null ? rs.getDouble("max_whlnth") : null)
.maxBeam(rs.getObject("max_beam") != null ? rs.getDouble("max_beam") : null)
.maxDwt(rs.getObject("max_dwt") != null ? rs.getDouble("max_dwt") : null)
.maxSeaDraft(rs.getObject("max_sea_draft") != null ? rs.getDouble("max_sea_draft") : null)
.maxSeaWhlnth(rs.getObject("max_sea_whlnth") != null ? rs.getDouble("max_sea_whlnth") : null)
.maxSeaBcm(rs.getObject("max_sea_bcm") != null ? rs.getDouble("max_sea_bcm") : null)
.maxSeaDwt(rs.getObject("max_sea_dwt") != null ? rs.getDouble("max_sea_dwt") : null)
.baleCargoFacility(rs.getObject("bale_cargo_facility") != null ? rs.getBoolean("bale_cargo_facility") : null)
.cntnrFacility(rs.getObject("cntnr_facility") != null ? rs.getBoolean("cntnr_facility") : null)
.caseCargoFacility(rs.getObject("case_cargo_facility") != null ? rs.getBoolean("case_cargo_facility") : null)
.liquidCargoFacility(rs.getObject("liquid_cargo_facility") != null ? rs.getBoolean("liquid_cargo_facility") : null)
.roroFacility(rs.getObject("roro_facility") != null ? rs.getBoolean("roro_facility") : null)
.paxfclty(rs.getObject("paxfclty") != null ? rs.getBoolean("paxfclty") : null)
.drydkfclty(rs.getObject("drydkfclty") != null ? rs.getBoolean("drydkfclty") : null)
.lpgFacility(rs.getObject("lpg_facility") != null ? rs.getLong("lpg_facility") : null)
.lngFacility(rs.getObject("lng_facility") != null ? rs.getLong("lng_facility") : null)
.lngBnkr(rs.getObject("lng_bnkr") != null ? rs.getBoolean("lng_bnkr") : null)
.doBnkr(rs.getObject("do_bnkr") != null ? rs.getBoolean("do_bnkr") : null)
.foBnkr(rs.getObject("fo_bnkr") != null ? rs.getBoolean("fo_bnkr") : null)
.ispsComplianceYn(rs.getObject("isps_compliance_yn") != null ? rs.getBoolean("isps_compliance_yn") : null)
.csiComplianceYn(rs.getObject("csi_compliance_yn") != null ? rs.getBoolean("csi_compliance_yn") : null)
.freeTrdZone(rs.getObject("free_trd_zone") != null ? rs.getBoolean("free_trd_zone") : null)
.ecfrdPort(rs.getObject("ecfrd_port") != null ? rs.getBoolean("ecfrd_port") : null)
.emsnCtrlArea(rs.getObject("emsn_ctrl_area") != null ? rs.getBoolean("emsn_ctrl_area") : null)
.wsPort(rs.getObject("ws_port") != null ? rs.getLong("ws_port") : null)
.lastMdfcnDt(lastMdfcnDtTs != null ? lastMdfcnDtTs.toInstant().atZone(java.time.ZoneId.systemDefault()) : null)
.regYmd(regYmdTs != null ? regYmdTs.toInstant().atZone(java.time.ZoneId.systemDefault()) : null)
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceFacilityPort);
businessJdbcTemplate.update(sql, targetExecutionId);
return FacilityPortDto.builder()
.jobExecutionId(targetId)
.portId(rs.getObject("port_id") != null ? rs.getLong("port_id") : null)
.bfrId(rs.getString("bfr_id"))
.status(rs.getString("status"))
.portNm(rs.getString("port_nm"))
.unPortCd(rs.getString("un_port_cd"))
.countryCd(rs.getString("country_cd"))
.countryNm(rs.getString("country_nm"))
.areanm(rs.getString("areanm"))
.cntntnm(rs.getString("cntntnm"))
.mstPortId(rs.getString("mst_port_id"))
.latDecml(rs.getObject("lat_decml") != null ? rs.getDouble("lat_decml") : null)
.lonDecml(rs.getObject("lon_decml") != null ? rs.getDouble("lon_decml") : null)
.positionLat(rs.getObject("position_lat") != null ? rs.getDouble("position_lat") : null)
.positionLon(rs.getObject("position_lon") != null ? rs.getDouble("position_lon") : null)
.positionZVal(rs.getObject("position_z_val") != null ? rs.getDouble("position_z_val") : null)
.positionMvalVal(rs.getObject("position_mval_val") != null ? rs.getDouble("position_mval_val") : null)
.zValHasYn(rs.getObject("z_val_has_yn") != null ? rs.getBoolean("z_val_has_yn") : null)
.mvalValHasYn(rs.getObject("mval_val_has_yn") != null ? rs.getBoolean("mval_val_has_yn") : null)
.positionNulYn(rs.getObject("position_nul_yn") != null ? rs.getBoolean("position_nul_yn") : null)
.positionStsId(rs.getObject("position_sts_id") != null ? rs.getLong("position_sts_id") : null)
.hrZone(rs.getString("hr_zone"))
.daylgtSaveHr(rs.getObject("daylgt_save_hr") != null ? rs.getBoolean("daylgt_save_hr") : null)
.maxDraft(rs.getObject("max_draft") != null ? rs.getDouble("max_draft") : null)
.maxWhlnth(rs.getObject("max_whlnth") != null ? rs.getDouble("max_whlnth") : null)
.maxBeam(rs.getObject("max_beam") != null ? rs.getDouble("max_beam") : null)
.maxDwt(rs.getObject("max_dwt") != null ? rs.getDouble("max_dwt") : null)
.maxSeaDraft(rs.getObject("max_sea_draft") != null ? rs.getDouble("max_sea_draft") : null)
.maxSeaWhlnth(rs.getObject("max_sea_whlnth") != null ? rs.getDouble("max_sea_whlnth") : null)
.maxSeaBcm(rs.getObject("max_sea_bcm") != null ? rs.getDouble("max_sea_bcm") : null)
.maxSeaDwt(rs.getObject("max_sea_dwt") != null ? rs.getDouble("max_sea_dwt") : null)
.baleCargoFacility(rs.getObject("bale_cargo_facility") != null ? rs.getBoolean("bale_cargo_facility") : null)
.cntnrFacility(rs.getObject("cntnr_facility") != null ? rs.getBoolean("cntnr_facility") : null)
.caseCargoFacility(rs.getObject("case_cargo_facility") != null ? rs.getBoolean("case_cargo_facility") : null)
.liquidCargoFacility(rs.getObject("liquid_cargo_facility") != null ? rs.getBoolean("liquid_cargo_facility") : null)
.roroFacility(rs.getObject("roro_facility") != null ? rs.getBoolean("roro_facility") : null)
.paxfclty(rs.getObject("paxfclty") != null ? rs.getBoolean("paxfclty") : null)
.drydkfclty(rs.getObject("drydkfclty") != null ? rs.getBoolean("drydkfclty") : null)
.lpgFacility(rs.getObject("lpg_facility") != null ? rs.getLong("lpg_facility") : null)
.lngFacility(rs.getObject("lng_facility") != null ? rs.getLong("lng_facility") : null)
.lngBnkr(rs.getObject("lng_bnkr") != null ? rs.getBoolean("lng_bnkr") : null)
.doBnkr(rs.getObject("do_bnkr") != null ? rs.getBoolean("do_bnkr") : null)
.foBnkr(rs.getObject("fo_bnkr") != null ? rs.getBoolean("fo_bnkr") : null)
.ispsComplianceYn(rs.getObject("isps_compliance_yn") != null ? rs.getBoolean("isps_compliance_yn") : null)
.csiComplianceYn(rs.getObject("csi_compliance_yn") != null ? rs.getBoolean("csi_compliance_yn") : null)
.freeTrdZone(rs.getObject("free_trd_zone") != null ? rs.getBoolean("free_trd_zone") : null)
.ecfrdPort(rs.getObject("ecfrd_port") != null ? rs.getBoolean("ecfrd_port") : null)
.emsnCtrlArea(rs.getObject("emsn_ctrl_area") != null ? rs.getBoolean("emsn_ctrl_area") : null)
.wsPort(rs.getObject("ws_port") != null ? rs.getLong("ws_port") : null)
.lastMdfcnDt(lastMdfcnDtTs != null ? lastMdfcnDtTs.toInstant().atZone(ZoneId.systemDefault()) : null)
.regYmd(regYmdTs != null ? regYmdTs.toInstant().atZone(ZoneId.systemDefault()) : null)
.build();
}
}
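The only behavioral nuance in this reader's rewrite is the timestamp handling, now using the imported `ZoneId` instead of the fully qualified name. The conversion for `last_mdfcn_dt` / `reg_ymd` can be isolated as a small helper (illustrative, not part of the diff):

```java
import java.sql.Timestamp;
import java.time.ZoneId;
import java.time.ZonedDateTime;

// Minimal illustration of the Timestamp -> ZonedDateTime conversion used
// above; null Timestamps (SQL NULL) stay null.
final class TimeMapping {
    private TimeMapping() {}

    static ZonedDateTime toZoned(Timestamp ts) {
        return ts != null ? ts.toInstant().atZone(ZoneId.systemDefault()) : null;
    }
}
```

Note that `ZoneId.systemDefault()` ties the result to the batch host's zone; if source and target systems run in different zones, a fixed zone would be the safer choice.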

View file

@@ -81,7 +81,7 @@ public class FacilityRepositoryImpl extends MultiDataSourceJdbcRepository<Facili
if (facilityPortEntityList == null || facilityPortEntityList.isEmpty()) {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", "FacilityPortEntity", facilityPortEntityList.size());
// log.debug("{} 배치 삽입 시작: {} 건", "FacilityPortEntity", facilityPortEntityList.size());
batchJdbcTemplate.batchUpdate(sql, facilityPortEntityList, facilityPortEntityList.size(),
(ps, entity) -> {
@@ -93,7 +93,7 @@ public class FacilityRepositoryImpl extends MultiDataSourceJdbcRepository<Facili
}
});
log.debug("{} 배치 삽입 완료: {} 건", "FacilityPortEntity", facilityPortEntityList.size());
// log.debug("{} 배치 삽입 완료: {} 건", "FacilityPortEntity", facilityPortEntityList.size());
}
public void bindFacilityPort(PreparedStatement pstmt, FacilityPortEntity entity) throws Exception {

View file

@@ -2,10 +2,7 @@ package com.snp.batch.jobs.datasync.batch.movement.config;
import com.snp.batch.common.batch.config.BaseJobConfig;
import com.snp.batch.common.util.BatchWriteListener;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.util.GroupByExecutionIdChunkListener;
import com.snp.batch.common.util.GroupByExecutionIdPolicy;
import com.snp.batch.common.util.GroupByExecutionIdReadListener;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.movement.dto.AnchorageCallDto;
import com.snp.batch.jobs.datasync.batch.movement.entity.AnchorageCallEntity;
@@ -95,20 +92,17 @@ public class AnchorageCallSyncJobConfig extends BaseJobConfig<AnchorageCallDto,
@Bean
public BatchWriteListener<AnchorageCallEntity> anchorageCallWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceTAnchorageCall);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceTAnchorageCall);
}
@Bean(name = "anchorageCallSyncStep")
public Step anchorageCallSyncStep() {
log.info("Step 생성: anchorageCallSyncStep");
return new StepBuilder(getStepName(), jobRepository)
.<AnchorageCallDto, AnchorageCallEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<AnchorageCallDto, AnchorageCallEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(createReader())
.processor(createProcessor())
.writer(createWriter())
.listener(new GroupByExecutionIdReadListener<AnchorageCallDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(anchorageCallWriteListener())
.build();
}

View file

@@ -2,10 +2,7 @@ package com.snp.batch.jobs.datasync.batch.movement.config;
import com.snp.batch.common.batch.config.BaseJobConfig;
import com.snp.batch.common.util.BatchWriteListener;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.util.GroupByExecutionIdChunkListener;
import com.snp.batch.common.util.GroupByExecutionIdPolicy;
import com.snp.batch.common.util.GroupByExecutionIdReadListener;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.movement.dto.BerthCallDto;
import com.snp.batch.jobs.datasync.batch.movement.entity.BerthCallEntity;
@@ -95,20 +92,17 @@ public class BerthCallSyncJobConfig extends BaseJobConfig<BerthCallDto, BerthCal
@Bean
public BatchWriteListener<BerthCallEntity> berthCallWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceTBerthCall);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceTBerthCall);
}
@Bean(name = "berthCallSyncStep")
public Step berthCallSyncStep() {
log.info("Step 생성: berthCallSyncStep");
return new StepBuilder(getStepName(), jobRepository)
.<BerthCallDto, BerthCallEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<BerthCallDto, BerthCallEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(createReader())
.processor(createProcessor())
.writer(createWriter())
.listener(new GroupByExecutionIdReadListener<BerthCallDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(berthCallWriteListener())
.build();
}

View file

@@ -2,10 +2,7 @@ package com.snp.batch.jobs.datasync.batch.movement.config;
import com.snp.batch.common.batch.config.BaseJobConfig;
import com.snp.batch.common.util.BatchWriteListener;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.util.GroupByExecutionIdChunkListener;
import com.snp.batch.common.util.GroupByExecutionIdPolicy;
import com.snp.batch.common.util.GroupByExecutionIdReadListener;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.movement.dto.CurrentlyAtDto;
import com.snp.batch.jobs.datasync.batch.movement.entity.CurrentlyAtEntity;
@@ -95,20 +92,17 @@ public class CurrentlyAtSyncJobConfig extends BaseJobConfig<CurrentlyAtDto, Curr
@Bean
public BatchWriteListener<CurrentlyAtEntity> currentlyAtWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceTCurrentlyAt);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceTCurrentlyAt);
}
@Bean(name = "currentlyAtSyncStep")
public Step currentlyAtSyncStep() {
log.info("Step 생성: currentlyAtSyncStep");
return new StepBuilder(getStepName(), jobRepository)
.<CurrentlyAtDto, CurrentlyAtEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<CurrentlyAtDto, CurrentlyAtEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(createReader())
.processor(createProcessor())
.writer(createWriter())
.listener(new GroupByExecutionIdReadListener<CurrentlyAtDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(currentlyAtWriteListener())
.build();
}

View file

@@ -2,10 +2,7 @@ package com.snp.batch.jobs.datasync.batch.movement.config;
import com.snp.batch.common.batch.config.BaseJobConfig;
import com.snp.batch.common.util.BatchWriteListener;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.util.GroupByExecutionIdChunkListener;
import com.snp.batch.common.util.GroupByExecutionIdPolicy;
import com.snp.batch.common.util.GroupByExecutionIdReadListener;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.movement.dto.DestinationDto;
import com.snp.batch.jobs.datasync.batch.movement.entity.DestinationEntity;
@@ -95,20 +92,17 @@ public class DestinationSyncJobConfig extends BaseJobConfig<DestinationDto, Dest
@Bean
public BatchWriteListener<DestinationEntity> destinationWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceTDestination);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceTDestination);
}
@Bean(name = "destinationSyncStep")
public Step destinationSyncStep() {
log.info("Step 생성: destinationSyncStep");
return new StepBuilder(getStepName(), jobRepository)
.<DestinationDto, DestinationEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<DestinationDto, DestinationEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(createReader())
.processor(createProcessor())
.writer(createWriter())
.listener(new GroupByExecutionIdReadListener<DestinationDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(destinationWriteListener())
.build();
}

View file

@@ -2,10 +2,7 @@ package com.snp.batch.jobs.datasync.batch.movement.config;
import com.snp.batch.common.batch.config.BaseJobConfig;
import com.snp.batch.common.util.BatchWriteListener;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.util.GroupByExecutionIdChunkListener;
import com.snp.batch.common.util.GroupByExecutionIdPolicy;
import com.snp.batch.common.util.GroupByExecutionIdReadListener;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.movement.dto.PortCallDto;
import com.snp.batch.jobs.datasync.batch.movement.entity.PortCallEntity;
@@ -95,20 +92,17 @@ public class PortCallSyncJobConfig extends BaseJobConfig<PortCallDto, PortCallEn
@Bean
public BatchWriteListener<PortCallEntity> portCallWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceTShipStpovInfo);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceTShipStpovInfo);
}
@Bean(name = "portCallSyncStep")
public Step portCallSyncStep() {
log.info("Step 생성: portCallSyncStep");
return new StepBuilder(getStepName(), jobRepository)
.<PortCallDto, PortCallEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<PortCallDto, PortCallEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(createReader())
.processor(createProcessor())
.writer(createWriter())
.listener(new GroupByExecutionIdReadListener<PortCallDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(portCallWriteListener())
.build();
}

View file

@@ -2,10 +2,7 @@ package com.snp.batch.jobs.datasync.batch.movement.config;
import com.snp.batch.common.batch.config.BaseJobConfig;
import com.snp.batch.common.util.BatchWriteListener;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.util.GroupByExecutionIdChunkListener;
import com.snp.batch.common.util.GroupByExecutionIdPolicy;
import com.snp.batch.common.util.GroupByExecutionIdReadListener;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.movement.dto.StsOperationDto;
import com.snp.batch.jobs.datasync.batch.movement.entity.StsOperationEntity;
@@ -95,20 +92,17 @@ public class StsOperationSyncJobConfig extends BaseJobConfig<StsOperationDto, St
@Bean
public BatchWriteListener<StsOperationEntity> stsOperationWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceTStsOperation);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceTStsOperation);
}
@Bean(name = "stsOperationSyncStep")
public Step stsOperationSyncStep() {
log.info("Step 생성: stsOperationSyncStep");
return new StepBuilder(getStepName(), jobRepository)
.<StsOperationDto, StsOperationEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<StsOperationDto, StsOperationEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(createReader())
.processor(createProcessor())
.writer(createWriter())
.listener(new GroupByExecutionIdReadListener<StsOperationDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(stsOperationWriteListener())
.build();
}

View file

@@ -2,10 +2,7 @@ package com.snp.batch.jobs.datasync.batch.movement.config;
import com.snp.batch.common.batch.config.BaseJobConfig;
import com.snp.batch.common.util.BatchWriteListener;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.util.GroupByExecutionIdChunkListener;
import com.snp.batch.common.util.GroupByExecutionIdPolicy;
import com.snp.batch.common.util.GroupByExecutionIdReadListener;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.movement.dto.TerminalCallDto;
import com.snp.batch.jobs.datasync.batch.movement.entity.TerminalCallEntity;
@@ -95,20 +92,17 @@ public class TerminalCallSyncJobConfig extends BaseJobConfig<TerminalCallDto, Te
@Bean
public BatchWriteListener<TerminalCallEntity> terminalCallWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceTTerminalCall);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceTTerminalCall);
}
@Bean(name = "terminalCallSyncStep")
public Step terminalCallSyncStep() {
log.info("Step 생성: terminalCallSyncStep");
return new StepBuilder(getStepName(), jobRepository)
.<TerminalCallDto, TerminalCallEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<TerminalCallDto, TerminalCallEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(createReader())
.processor(createProcessor())
.writer(createWriter())
.listener(new GroupByExecutionIdReadListener<TerminalCallDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(terminalCallWriteListener())
.build();
}

View file

@@ -2,10 +2,7 @@ package com.snp.batch.jobs.datasync.batch.movement.config;
import com.snp.batch.common.batch.config.BaseJobConfig;
import com.snp.batch.common.util.BatchWriteListener;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.util.GroupByExecutionIdChunkListener;
import com.snp.batch.common.util.GroupByExecutionIdPolicy;
import com.snp.batch.common.util.GroupByExecutionIdReadListener;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.movement.dto.TransitDto;
import com.snp.batch.jobs.datasync.batch.movement.entity.TransitEntity;
@@ -95,20 +92,17 @@ public class TransitSyncJobConfig extends BaseJobConfig<TransitDto, TransitEntit
@Bean
public BatchWriteListener<TransitEntity> transitWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceTTransit);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceTTransit);
}
@Bean(name = "transitSyncStep")
public Step transitSyncStep() {
log.info("Step 생성: transitSyncStep");
return new StepBuilder(getStepName(), jobRepository)
.<TransitDto, TransitEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<TransitDto, TransitEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(createReader())
.processor(createProcessor())
.writer(createWriter())
.listener(new GroupByExecutionIdReadListener<TransitDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(transitWriteListener())
.build();
}

View file

@@ -1,85 +1,52 @@
package com.snp.batch.jobs.datasync.batch.movement.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.movement.dto.AnchorageCallDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.math.BigDecimal;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.ArrayList;
import java.util.List;
@Slf4j
public class AnchorageCallReader implements ItemReader<AnchorageCallDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<AnchorageCallDto> allDataBuffer = new ArrayList<>();
public class AnchorageCallReader extends BaseSyncReader<AnchorageCallDto> {
public AnchorageCallReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public AnchorageCallDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceTAnchorageCall;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceTAnchorageCall), Long.class);
} catch (Exception e) {
return;
}
@Override
protected AnchorageCallDto mapRow(ResultSet rs, Long targetId) throws SQLException {
Timestamp mvmnDtTs = rs.getTimestamp("mvmn_dt");
if (nextTargetId != null) {
log.info("[AnchorageCallReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceTAnchorageCall);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
Timestamp mvmnDtTs = rs.getTimestamp("mvmn_dt");
return AnchorageCallDto.builder()
.jobExecutionId(targetId)
.imoNo(rs.getString("imo_no"))
.mvmnType(rs.getString("mvmn_type"))
.mvmnDt(mvmnDtTs != null ? mvmnDtTs.toLocalDateTime() : null)
.prtcllId(rs.getObject("prtcll_id") != null ? rs.getInt("prtcll_id") : null)
.facilityId(rs.getObject("facility_id") != null ? rs.getInt("facility_id") : null)
.facilityNm(rs.getString("facility_nm"))
.facilityType(rs.getString("facility_type"))
.lwrnkFacilityId(rs.getObject("lwrnk_facility_id") != null ? rs.getInt("lwrnk_facility_id") : null)
.lwrnkFacilityDesc(rs.getString("lwrnk_facility_desc"))
.lwrnkFacilityType(rs.getString("lwrnk_facility_type"))
.countryCd(rs.getString("country_cd"))
.countryNm(rs.getString("country_nm"))
.draft(rs.getObject("draft") != null ? rs.getBigDecimal("draft") : null)
.lat(rs.getObject("lat") != null ? rs.getBigDecimal("lat") : null)
.lon(rs.getObject("lon") != null ? rs.getBigDecimal("lon") : null)
.positionInfo(rs.getString("position_info"))
.dest(rs.getString("dest"))
.isoTwoCountryCd(rs.getString("iso_two_country_cd"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceTAnchorageCall);
businessJdbcTemplate.update(sql, targetExecutionId);
return AnchorageCallDto.builder()
.jobExecutionId(targetId)
.imoNo(rs.getString("imo_no"))
.mvmnType(rs.getString("mvmn_type"))
.mvmnDt(mvmnDtTs != null ? mvmnDtTs.toLocalDateTime() : null)
.prtcllId(rs.getObject("prtcll_id") != null ? rs.getInt("prtcll_id") : null)
.facilityId(rs.getObject("facility_id") != null ? rs.getInt("facility_id") : null)
.facilityNm(rs.getString("facility_nm"))
.facilityType(rs.getString("facility_type"))
.lwrnkFacilityId(rs.getObject("lwrnk_facility_id") != null ? rs.getInt("lwrnk_facility_id") : null)
.lwrnkFacilityDesc(rs.getString("lwrnk_facility_desc"))
.lwrnkFacilityType(rs.getString("lwrnk_facility_type"))
.countryCd(rs.getString("country_cd"))
.countryNm(rs.getString("country_nm"))
.draft(rs.getObject("draft") != null ? rs.getBigDecimal("draft") : null)
.lat(rs.getObject("lat") != null ? rs.getBigDecimal("lat") : null)
.lon(rs.getObject("lon") != null ? rs.getBigDecimal("lon") : null)
.positionInfo(rs.getString("position_info"))
.dest(rs.getString("dest"))
.isoTwoCountryCd(rs.getString("iso_two_country_cd"))
.build();
}
}
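The readers in this refactor all collapse into the same template-method shape: the base class owns the buffer-and-refill read loop, while each subclass supplies only its source table and row mapping. A minimal self-contained sketch of that pattern (plain Java, no Spring; `GroupedReader` and the list-backed fetch are illustrative names for this sketch, not the project's actual `BaseSyncReader`):

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

// Template method: the base class owns the buffer-and-refill read loop,
// mirroring the read()/fetchNextGroup() logic hoisted out of each reader.
abstract class GroupedReader<T> {
    private final Deque<T> buffer = new ArrayDeque<>();

    // Subclasses fetch the next group of rows; an empty list means exhausted.
    protected abstract List<T> fetchNextGroup();

    public T read() {
        if (buffer.isEmpty()) {
            buffer.addAll(fetchNextGroup());
        }
        return buffer.isEmpty() ? null : buffer.poll();
    }
}

// A concrete reader supplies only its data source, the way each
// *CallReader above now supplies just getSourceTable() and mapRow().
class FixedGroupReader extends GroupedReader<String> {
    private final Deque<List<String>> groups = new ArrayDeque<>();

    FixedGroupReader(List<List<String>> input) {
        groups.addAll(input);
    }

    @Override
    protected List<String> fetchNextGroup() {
        List<String> next = groups.poll();
        return next == null ? List.of() : next;
    }
}
```

Each call to `read()` drains the current group before fetching the next and returns `null` once the source is exhausted, which is the contract Spring Batch's `ItemReader` expects.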

View file

@@ -1,85 +1,53 @@
package com.snp.batch.jobs.datasync.batch.movement.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.movement.dto.BerthCallDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.ArrayList;
import java.util.List;
@Slf4j
public class BerthCallReader implements ItemReader<BerthCallDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<BerthCallDto> allDataBuffer = new ArrayList<>();
public class BerthCallReader extends BaseSyncReader<BerthCallDto> {
public BerthCallReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public BerthCallDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceTBerthCall;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceTBerthCall), Long.class);
} catch (Exception e) {
return;
}
@Override
protected BerthCallDto mapRow(ResultSet rs, Long targetId) throws SQLException {
Timestamp mvmnDtTs = rs.getTimestamp("mvmn_dt");
Timestamp eventStaDtTs = rs.getTimestamp("event_sta_dt");
if (nextTargetId != null) {
log.info("[BerthCallReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceTBerthCall);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
Timestamp mvmnDtTs = rs.getTimestamp("mvmn_dt");
Timestamp eventStaDtTs = rs.getTimestamp("event_sta_dt");
return BerthCallDto.builder()
.jobExecutionId(targetId)
.imoNo(rs.getString("imo_no"))
.mvmnType(rs.getString("mvmn_type"))
.mvmnDt(mvmnDtTs != null ? mvmnDtTs.toLocalDateTime() : null)
.facilityId(rs.getObject("facility_id") != null ? rs.getInt("facility_id") : null)
.facilityNm(rs.getString("facility_nm"))
.facilityType(rs.getString("facility_type"))
.upFacilityId(rs.getObject("up_facility_id") != null ? rs.getInt("up_facility_id") : null)
.upFacilityNm(rs.getString("up_facility_nm"))
.upFacilityType(rs.getString("up_facility_type"))
.countryCd(rs.getString("country_cd"))
.countryNm(rs.getString("country_nm"))
.draft(rs.getObject("draft") != null ? rs.getBigDecimal("draft") : null)
.lat(rs.getObject("lat") != null ? rs.getBigDecimal("lat") : null)
.lon(rs.getObject("lon") != null ? rs.getBigDecimal("lon") : null)
.positionInfo(rs.getString("position_info"))
.upClotId(rs.getObject("up_clot_id") != null ? rs.getLong("up_clot_id") : null)
.isoTwoCountryCd(rs.getString("iso_two_country_cd"))
.eventStaDt(eventStaDtTs != null ? eventStaDtTs.toLocalDateTime() : null)
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceTBerthCall);
businessJdbcTemplate.update(sql, targetExecutionId);
return BerthCallDto.builder()
.jobExecutionId(targetId)
.imoNo(rs.getString("imo_no"))
.mvmnType(rs.getString("mvmn_type"))
.mvmnDt(mvmnDtTs != null ? mvmnDtTs.toLocalDateTime() : null)
.facilityId(rs.getObject("facility_id") != null ? rs.getInt("facility_id") : null)
.facilityNm(rs.getString("facility_nm"))
.facilityType(rs.getString("facility_type"))
.upFacilityId(rs.getObject("up_facility_id") != null ? rs.getInt("up_facility_id") : null)
.upFacilityNm(rs.getString("up_facility_nm"))
.upFacilityType(rs.getString("up_facility_type"))
.countryCd(rs.getString("country_cd"))
.countryNm(rs.getString("country_nm"))
.draft(rs.getObject("draft") != null ? rs.getBigDecimal("draft") : null)
.lat(rs.getObject("lat") != null ? rs.getBigDecimal("lat") : null)
.lon(rs.getObject("lon") != null ? rs.getBigDecimal("lon") : null)
.positionInfo(rs.getString("position_info"))
.upClotId(rs.getObject("up_clot_id") != null ? rs.getLong("up_clot_id") : null)
.isoTwoCountryCd(rs.getString("iso_two_country_cd"))
.eventStaDt(eventStaDtTs != null ? eventStaDtTs.toLocalDateTime() : null)
.build();
}
}

View file

@@ -1,87 +1,55 @@
package com.snp.batch.jobs.datasync.batch.movement.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.movement.dto.CurrentlyAtDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.ArrayList;
import java.util.List;
@Slf4j
public class CurrentlyAtReader implements ItemReader<CurrentlyAtDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<CurrentlyAtDto> allDataBuffer = new ArrayList<>();
public class CurrentlyAtReader extends BaseSyncReader<CurrentlyAtDto> {
public CurrentlyAtReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public CurrentlyAtDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceTCurrentlyAt;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceTCurrentlyAt), Long.class);
} catch (Exception e) {
return;
}
@Override
protected CurrentlyAtDto mapRow(ResultSet rs, Long targetId) throws SQLException {
Timestamp mvmnDtTs = rs.getTimestamp("mvmn_dt");
if (nextTargetId != null) {
log.info("[CurrentlyAtReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceTCurrentlyAt);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
Timestamp mvmnDtTs = rs.getTimestamp("mvmn_dt");
return CurrentlyAtDto.builder()
.jobExecutionId(targetId)
.imoNo(rs.getString("imo_no"))
.mvmnType(rs.getString("mvmn_type"))
.mvmnDt(mvmnDtTs != null ? mvmnDtTs.toLocalDateTime() : null)
.prtcllId(rs.getObject("prtcll_id") != null ? rs.getInt("prtcll_id") : null)
.facilityId(rs.getObject("facility_id") != null ? rs.getInt("facility_id") : null)
.facilityNm(rs.getString("facility_nm"))
.facilityType(rs.getString("facility_type"))
.lwrnkFacilityId(rs.getObject("lwrnk_facility_id") != null ? rs.getInt("lwrnk_facility_id") : null)
.lwrnkFacilityDesc(rs.getString("lwrnk_facility_desc"))
.lwrnkFacilityType(rs.getString("lwrnk_facility_type"))
.upFacilityId(rs.getObject("up_facility_id") != null ? rs.getInt("up_facility_id") : null)
.upFacilityNm(rs.getString("up_facility_nm"))
.upFacilityType(rs.getString("up_facility_type"))
.countryCd(rs.getString("country_cd"))
.countryNm(rs.getString("country_nm"))
.draft(rs.getObject("draft") != null ? rs.getBigDecimal("draft") : null)
.lat(rs.getObject("lat") != null ? rs.getBigDecimal("lat") : null)
.lon(rs.getObject("lon") != null ? rs.getBigDecimal("lon") : null)
.dest(rs.getString("dest"))
.countryIsoTwoCd(rs.getString("country_iso_two_cd"))
.positionInfo(rs.getString("position_info"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceTCurrentlyAt);
businessJdbcTemplate.update(sql, targetExecutionId);
return CurrentlyAtDto.builder()
.jobExecutionId(targetId)
.imoNo(rs.getString("imo_no"))
.mvmnType(rs.getString("mvmn_type"))
.mvmnDt(mvmnDtTs != null ? mvmnDtTs.toLocalDateTime() : null)
.prtcllId(rs.getObject("prtcll_id") != null ? rs.getInt("prtcll_id") : null)
.facilityId(rs.getObject("facility_id") != null ? rs.getInt("facility_id") : null)
.facilityNm(rs.getString("facility_nm"))
.facilityType(rs.getString("facility_type"))
.lwrnkFacilityId(rs.getObject("lwrnk_facility_id") != null ? rs.getInt("lwrnk_facility_id") : null)
.lwrnkFacilityDesc(rs.getString("lwrnk_facility_desc"))
.lwrnkFacilityType(rs.getString("lwrnk_facility_type"))
.upFacilityId(rs.getObject("up_facility_id") != null ? rs.getInt("up_facility_id") : null)
.upFacilityNm(rs.getString("up_facility_nm"))
.upFacilityType(rs.getString("up_facility_type"))
.countryCd(rs.getString("country_cd"))
.countryNm(rs.getString("country_nm"))
.draft(rs.getObject("draft") != null ? rs.getBigDecimal("draft") : null)
.lat(rs.getObject("lat") != null ? rs.getBigDecimal("lat") : null)
.lon(rs.getObject("lon") != null ? rs.getBigDecimal("lon") : null)
.dest(rs.getString("dest"))
.countryIsoTwoCd(rs.getString("country_iso_two_cd"))
.positionInfo(rs.getString("position_info"))
.build();
}
}

View file

@@ -1,78 +1,46 @@
package com.snp.batch.jobs.datasync.batch.movement.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.movement.dto.DestinationDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.ArrayList;
import java.util.List;
@Slf4j
public class DestinationReader implements ItemReader<DestinationDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<DestinationDto> allDataBuffer = new ArrayList<>();
public class DestinationReader extends BaseSyncReader<DestinationDto> {
public DestinationReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public DestinationDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceTDestination;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceTDestination), Long.class);
} catch (Exception e) {
return;
}
@Override
protected DestinationDto mapRow(ResultSet rs, Long targetId) throws SQLException {
Timestamp mvmnDtTs = rs.getTimestamp("mvmn_dt");
if (nextTargetId != null) {
log.info("[DestinationReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceTDestination);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
Timestamp mvmnDtTs = rs.getTimestamp("mvmn_dt");
return DestinationDto.builder()
.jobExecutionId(targetId)
.imoNo(rs.getString("imo_no"))
.mvmnType(rs.getString("mvmn_type"))
.mvmnDt(mvmnDtTs != null ? mvmnDtTs.toLocalDateTime() : null)
.facilityId(rs.getObject("facility_id") != null ? rs.getInt("facility_id") : null)
.facilityNm(rs.getString("facility_nm"))
.facilityType(rs.getString("facility_type"))
.countryCd(rs.getString("country_cd"))
.countryNm(rs.getString("country_nm"))
.lat(rs.getObject("lat") != null ? rs.getBigDecimal("lat") : null)
.lon(rs.getObject("lon") != null ? rs.getBigDecimal("lon") : null)
.positionInfo(rs.getString("position_info"))
.countryIsoTwoCd(rs.getString("country_iso_two_cd"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceTDestination);
businessJdbcTemplate.update(sql, targetExecutionId);
return DestinationDto.builder()
.jobExecutionId(targetId)
.imoNo(rs.getString("imo_no"))
.mvmnType(rs.getString("mvmn_type"))
.mvmnDt(mvmnDtTs != null ? mvmnDtTs.toLocalDateTime() : null)
.facilityId(rs.getObject("facility_id") != null ? rs.getInt("facility_id") : null)
.facilityNm(rs.getString("facility_nm"))
.facilityType(rs.getString("facility_type"))
.countryCd(rs.getString("country_cd"))
.countryNm(rs.getString("country_nm"))
.lat(rs.getObject("lat") != null ? rs.getBigDecimal("lat") : null)
.lon(rs.getObject("lon") != null ? rs.getBigDecimal("lon") : null)
.positionInfo(rs.getString("position_info"))
.countryIsoTwoCd(rs.getString("country_iso_two_cd"))
.build();
}
}

View file

@@ -1,87 +1,55 @@
package com.snp.batch.jobs.datasync.batch.movement.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.movement.dto.PortCallDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.ArrayList;
import java.util.List;
@Slf4j
public class PortCallReader implements ItemReader<PortCallDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<PortCallDto> allDataBuffer = new ArrayList<>();
public class PortCallReader extends BaseSyncReader<PortCallDto> {
public PortCallReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public PortCallDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceTShipStpovInfo;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceTShipStpovInfo), Long.class);
} catch (Exception e) {
return;
}
@Override
protected PortCallDto mapRow(ResultSet rs, Long targetId) throws SQLException {
Timestamp mvmnDtTs = rs.getTimestamp("mvmn_dt");
if (nextTargetId != null) {
log.info("[PortCallReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceTShipStpovInfo);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
Timestamp mvmnDtTs = rs.getTimestamp("mvmn_dt");
return PortCallDto.builder()
.jobExecutionId(targetId)
.imoNo(rs.getString("imo_no"))
.mvmnType(rs.getString("mvmn_type"))
.mvmnDt(mvmnDtTs != null ? mvmnDtTs.toLocalDateTime() : null)
.prtcllId(rs.getObject("prtcll_id") != null ? rs.getInt("prtcll_id") : null)
.facilityId(rs.getObject("facility_id") != null ? rs.getInt("facility_id") : null)
.facilityNm(rs.getString("facility_nm"))
.facilityType(rs.getString("facility_type"))
.lwrnkFacilityId(rs.getObject("lwrnk_facility_id") != null ? rs.getInt("lwrnk_facility_id") : null)
.lwrnkFacilityDesc(rs.getString("lwrnk_facility_desc"))
.lwrnkFacilityType(rs.getString("lwrnk_facility_type"))
.upFacilityId(rs.getObject("up_facility_id") != null ? rs.getInt("up_facility_id") : null)
.upFacilityNm(rs.getString("up_facility_nm"))
.upFacilityType(rs.getString("up_facility_type"))
.countryCd(rs.getString("country_cd"))
.countryNm(rs.getString("country_nm"))
.draft(rs.getObject("draft") != null ? rs.getBigDecimal("draft") : null)
.lat(rs.getObject("lat") != null ? rs.getBigDecimal("lat") : null)
.lon(rs.getObject("lon") != null ? rs.getBigDecimal("lon") : null)
.dest(rs.getString("dest"))
.countryIsoTwoCd(rs.getString("country_iso_two_cd"))
.positionInfo(rs.getString("position_info"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceTShipStpovInfo);
businessJdbcTemplate.update(sql, targetExecutionId);
return PortCallDto.builder()
.jobExecutionId(targetId)
.imoNo(rs.getString("imo_no"))
.mvmnType(rs.getString("mvmn_type"))
.mvmnDt(mvmnDtTs != null ? mvmnDtTs.toLocalDateTime() : null)
.prtcllId(rs.getObject("prtcll_id") != null ? rs.getInt("prtcll_id") : null)
.facilityId(rs.getObject("facility_id") != null ? rs.getInt("facility_id") : null)
.facilityNm(rs.getString("facility_nm"))
.facilityType(rs.getString("facility_type"))
.lwrnkFacilityId(rs.getObject("lwrnk_facility_id") != null ? rs.getInt("lwrnk_facility_id") : null)
.lwrnkFacilityDesc(rs.getString("lwrnk_facility_desc"))
.lwrnkFacilityType(rs.getString("lwrnk_facility_type"))
.upFacilityId(rs.getObject("up_facility_id") != null ? rs.getInt("up_facility_id") : null)
.upFacilityNm(rs.getString("up_facility_nm"))
.upFacilityType(rs.getString("up_facility_type"))
.countryCd(rs.getString("country_cd"))
.countryNm(rs.getString("country_nm"))
.draft(rs.getObject("draft") != null ? rs.getBigDecimal("draft") : null)
.lat(rs.getObject("lat") != null ? rs.getBigDecimal("lat") : null)
.lon(rs.getObject("lon") != null ? rs.getBigDecimal("lon") : null)
.dest(rs.getString("dest"))
.countryIsoTwoCd(rs.getString("country_iso_two_cd"))
.positionInfo(rs.getString("position_info"))
.build();
}
}

View file

@@ -1,86 +1,54 @@
package com.snp.batch.jobs.datasync.batch.movement.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.movement.dto.StsOperationDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.ArrayList;
import java.util.List;
@Slf4j
public class StsOperationReader implements ItemReader<StsOperationDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<StsOperationDto> allDataBuffer = new ArrayList<>();
public class StsOperationReader extends BaseSyncReader<StsOperationDto> {
public StsOperationReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public StsOperationDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceTStsOperation;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceTStsOperation), Long.class);
} catch (Exception e) {
return;
}
@Override
protected StsOperationDto mapRow(ResultSet rs, Long targetId) throws SQLException {
Timestamp mvmnDtTs = rs.getTimestamp("mvmn_dt");
Timestamp eventStaDtTs = rs.getTimestamp("event_sta_dt");
if (nextTargetId != null) {
log.info("[StsOperationReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceTStsOperation);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
Timestamp mvmnDtTs = rs.getTimestamp("mvmn_dt");
Timestamp eventStaDtTs = rs.getTimestamp("event_sta_dt");
return StsOperationDto.builder()
.jobExecutionId(targetId)
.imoNo(rs.getString("imo_no"))
.mvmnType(rs.getString("mvmn_type"))
.mvmnDt(mvmnDtTs != null ? mvmnDtTs.toLocalDateTime() : null)
.facilityId(rs.getObject("facility_id") != null ? rs.getInt("facility_id") : null)
.facilityNm(rs.getString("facility_nm"))
.facilityType(rs.getString("facility_type"))
.upFacilityId(rs.getObject("up_facility_id") != null ? rs.getInt("up_facility_id") : null)
.upFacilityNm(rs.getString("up_facility_nm"))
.upFacilityType(rs.getString("up_facility_type"))
.draft(rs.getObject("draft") != null ? rs.getBigDecimal("draft") : null)
.lat(rs.getObject("lat") != null ? rs.getBigDecimal("lat") : null)
.lon(rs.getObject("lon") != null ? rs.getBigDecimal("lon") : null)
.positionInfo(rs.getString("position_info"))
.upPrtcllId(rs.getObject("up_prtcll_id") != null ? rs.getLong("up_prtcll_id") : null)
.countryCd(rs.getString("country_cd"))
.countryNm(rs.getString("country_nm"))
.stsPosition(rs.getString("sts_position"))
.stsType(rs.getString("sts_type"))
.eventStaDt(eventStaDtTs != null ? eventStaDtTs.toLocalDateTime() : null)
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceTStsOperation);
businessJdbcTemplate.update(sql, targetExecutionId);
return StsOperationDto.builder()
.jobExecutionId(targetId)
.imoNo(rs.getString("imo_no"))
.mvmnType(rs.getString("mvmn_type"))
.mvmnDt(mvmnDtTs != null ? mvmnDtTs.toLocalDateTime() : null)
.facilityId(rs.getObject("facility_id") != null ? rs.getInt("facility_id") : null)
.facilityNm(rs.getString("facility_nm"))
.facilityType(rs.getString("facility_type"))
.upFacilityId(rs.getObject("up_facility_id") != null ? rs.getInt("up_facility_id") : null)
.upFacilityNm(rs.getString("up_facility_nm"))
.upFacilityType(rs.getString("up_facility_type"))
.draft(rs.getObject("draft") != null ? rs.getBigDecimal("draft") : null)
.lat(rs.getObject("lat") != null ? rs.getBigDecimal("lat") : null)
.lon(rs.getObject("lon") != null ? rs.getBigDecimal("lon") : null)
.positionInfo(rs.getString("position_info"))
.upPrtcllId(rs.getObject("up_prtcll_id") != null ? rs.getLong("up_prtcll_id") : null)
.countryCd(rs.getString("country_cd"))
.countryNm(rs.getString("country_nm"))
.stsPosition(rs.getString("sts_position"))
.stsType(rs.getString("sts_type"))
.eventStaDt(eventStaDtTs != null ? eventStaDtTs.toLocalDateTime() : null)
.build();
}
}

View file

@@ -1,88 +1,56 @@
package com.snp.batch.jobs.datasync.batch.movement.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.movement.dto.TerminalCallDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.ArrayList;
import java.util.List;
@Slf4j
public class TerminalCallReader implements ItemReader<TerminalCallDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<TerminalCallDto> allDataBuffer = new ArrayList<>();
public class TerminalCallReader extends BaseSyncReader<TerminalCallDto> {
public TerminalCallReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public TerminalCallDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceTTerminalCall;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceTTerminalCall), Long.class);
} catch (Exception e) {
return;
}
@Override
protected TerminalCallDto mapRow(ResultSet rs, Long targetId) throws SQLException {
Timestamp mvmnDtTs = rs.getTimestamp("mvmn_dt");
Timestamp eventStaDtTs = rs.getTimestamp("event_sta_dt");
if (nextTargetId != null) {
log.info("[TerminalCallReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceTTerminalCall);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
Timestamp mvmnDtTs = rs.getTimestamp("mvmn_dt");
Timestamp eventStaDtTs = rs.getTimestamp("event_sta_dt");
return TerminalCallDto.builder()
.jobExecutionId(targetId)
.imoNo(rs.getString("imo_no"))
.mvmnType(rs.getString("mvmn_type"))
.mvmnDt(mvmnDtTs != null ? mvmnDtTs.toLocalDateTime() : null)
.facilityId(rs.getObject("facility_id") != null ? rs.getInt("facility_id") : null)
.facilityNm(rs.getString("facility_nm"))
.facilityType(rs.getString("facility_type"))
.upFacilityId(rs.getObject("up_facility_id") != null ? rs.getInt("up_facility_id") : null)
.upFacilityNm(rs.getString("up_facility_nm"))
.upFacilityType(rs.getString("up_facility_type"))
.countryCd(rs.getString("country_cd"))
.countryNm(rs.getString("country_nm"))
.draft(rs.getObject("draft") != null ? rs.getBigDecimal("draft") : null)
.lat(rs.getObject("lat") != null ? rs.getBigDecimal("lat") : null)
.lon(rs.getObject("lon") != null ? rs.getBigDecimal("lon") : null)
.positionInfo(rs.getString("position_info"))
.upPrtcllId(rs.getObject("up_prtcll_id") != null ? rs.getLong("up_prtcll_id") : null)
.countryIsoTwoCd(rs.getString("country_iso_two_cd"))
.eventStaDt(eventStaDtTs != null ? eventStaDtTs.toLocalDateTime() : null)
.lwrnkFacilityId(rs.getObject("lwrnk_facility_id") != null ? rs.getInt("lwrnk_facility_id") : null)
.lwrnkFacilityDesc(rs.getString("lwrnk_facility_desc"))
.lwrnkFacilityType(rs.getString("lwrnk_facility_type"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceTTerminalCall);
businessJdbcTemplate.update(sql, targetExecutionId);
return TerminalCallDto.builder()
.jobExecutionId(targetId)
.imoNo(rs.getString("imo_no"))
.mvmnType(rs.getString("mvmn_type"))
.mvmnDt(mvmnDtTs != null ? mvmnDtTs.toLocalDateTime() : null)
.facilityId(rs.getObject("facility_id") != null ? rs.getInt("facility_id") : null)
.facilityNm(rs.getString("facility_nm"))
.facilityType(rs.getString("facility_type"))
.upFacilityId(rs.getObject("up_facility_id") != null ? rs.getInt("up_facility_id") : null)
.upFacilityNm(rs.getString("up_facility_nm"))
.upFacilityType(rs.getString("up_facility_type"))
.countryCd(rs.getString("country_cd"))
.countryNm(rs.getString("country_nm"))
.draft(rs.getObject("draft") != null ? rs.getBigDecimal("draft") : null)
.lat(rs.getObject("lat") != null ? rs.getBigDecimal("lat") : null)
.lon(rs.getObject("lon") != null ? rs.getBigDecimal("lon") : null)
.positionInfo(rs.getString("position_info"))
.upPrtcllId(rs.getObject("up_prtcll_id") != null ? rs.getLong("up_prtcll_id") : null)
.countryIsoTwoCd(rs.getString("country_iso_two_cd"))
.eventStaDt(eventStaDtTs != null ? eventStaDtTs.toLocalDateTime() : null)
.lwrnkFacilityId(rs.getObject("lwrnk_facility_id") != null ? rs.getInt("lwrnk_facility_id") : null)
.lwrnkFacilityDesc(rs.getString("lwrnk_facility_desc"))
.lwrnkFacilityType(rs.getString("lwrnk_facility_type"))
.build();
}
}

View File

@@ -1,72 +1,40 @@
package com.snp.batch.jobs.datasync.batch.movement.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.movement.dto.TransitDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.ArrayList;
import java.util.List;
@Slf4j
public class TransitReader implements ItemReader<TransitDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<TransitDto> allDataBuffer = new ArrayList<>();
public class TransitReader extends BaseSyncReader<TransitDto> {
public TransitReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public TransitDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceTTransit;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceTTransit), Long.class);
} catch (Exception e) {
return;
}
@Override
protected TransitDto mapRow(ResultSet rs, Long targetId) throws SQLException {
Timestamp mvmnDtTs = rs.getTimestamp("mvmn_dt");
if (nextTargetId != null) {
log.info("[TransitReader] 다음 처리 대상 ID 발견: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceTTransit);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
Timestamp mvmnDtTs = rs.getTimestamp("mvmn_dt");
return TransitDto.builder()
.jobExecutionId(targetId)
.imoNo(rs.getString("imo_no"))
.mvmnType(rs.getString("mvmn_type"))
.mvmnDt(mvmnDtTs != null ? mvmnDtTs.toLocalDateTime() : null)
.facilityNm(rs.getString("facility_nm"))
.facilityType(rs.getString("facility_type"))
.draft(rs.getObject("draft") != null ? rs.getBigDecimal("draft") : null)
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceTTransit);
businessJdbcTemplate.update(sql, targetExecutionId);
return TransitDto.builder()
.jobExecutionId(targetId)
.imoNo(rs.getString("imo_no"))
.mvmnType(rs.getString("mvmn_type"))
.mvmnDt(mvmnDtTs != null ? mvmnDtTs.toLocalDateTime() : null)
.facilityNm(rs.getString("facility_nm"))
.facilityType(rs.getString("facility_type"))
.draft(rs.getObject("draft") != null ? rs.getBigDecimal("draft") : null)
.build();
}
}
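The refactor above removes the same `read()`/`fetchNextGroup()`/`updateBatchProcessing()` boilerplate from every reader, leaving each subclass with only `getSourceTable()` and `mapRow()`. A framework-free sketch of that template-method shape, inferred from the subclass overrides in this diff (class names, the `fetchGroup` hook, and the queue mechanics are illustrative assumptions, not the real `BaseSyncReader`, which queries JDBC for the next execution-id group and marks it in-progress before buffering):

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

// Template-method sketch: the base class owns the buffer-and-drain read loop,
// subclasses supply only the source table and the per-row mapping.
abstract class SyncReaderSketch<T> {
    private final Deque<T> buffer = new ArrayDeque<>();
    private boolean exhausted = false;

    protected abstract String getSourceTable();
    protected abstract List<T> fetchGroup(String table); // stands in for the JDBC query

    public T read() {
        if (buffer.isEmpty() && !exhausted) {
            List<T> group = fetchGroup(getSourceTable());
            if (group.isEmpty()) {
                exhausted = true;        // no next target id: end of step
            } else {
                buffer.addAll(group);    // buffer one execution-id group
            }
        }
        return buffer.poll();            // null signals the step to stop reading
    }
}

// Hypothetical subclass mirroring TransitReader's two overrides.
class TransitReaderSketch extends SyncReaderSketch<String> {
    private int calls = 0;

    @Override protected String getSourceTable() { return "t_transit"; }

    @Override protected List<String> fetchGroup(String table) {
        // First call yields one group; later calls yield nothing, ending the read loop.
        return calls++ == 0 ? List.of(table + ":row1", table + ":row2") : List.of();
    }
}
```

Each concrete reader in this PR (TerminalCall, Transit, PscDetail, PscDefect, PscAllCertificate, Risk) reduces to exactly these two overrides.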

View File

@@ -80,7 +80,7 @@ public class MovementRepositoryImpl extends MultiDataSourceJdbcRepository<Anchor
if (anchorageCallEntityList == null || anchorageCallEntityList.isEmpty()) {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", "AnchorageCallEntity", anchorageCallEntityList.size());
// log.debug("{} 배치 삽입 시작: {} 건", "AnchorageCallEntity", anchorageCallEntityList.size());
batchJdbcTemplate.batchUpdate(sql, anchorageCallEntityList, anchorageCallEntityList.size(),
(ps, entity) -> {
@@ -92,7 +92,7 @@ public class MovementRepositoryImpl extends MultiDataSourceJdbcRepository<Anchor
}
});
log.debug("{} 배치 삽입 완료: {} 건", "AnchorageCallEntity", anchorageCallEntityList.size());
// log.debug("{} 배치 삽입 완료: {} 건", "AnchorageCallEntity", anchorageCallEntityList.size());
}
public void bindAnchorageCall(PreparedStatement pstmt, AnchorageCallEntity entity) throws Exception {
@@ -125,7 +125,7 @@ public class MovementRepositoryImpl extends MultiDataSourceJdbcRepository<Anchor
if (berthCallEntityList == null || berthCallEntityList.isEmpty()) {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", "BerthCallEntity", berthCallEntityList.size());
// log.debug("{} 배치 삽입 시작: {} 건", "BerthCallEntity", berthCallEntityList.size());
batchJdbcTemplate.batchUpdate(sql, berthCallEntityList, berthCallEntityList.size(),
(ps, entity) -> {
@@ -137,7 +137,7 @@ public class MovementRepositoryImpl extends MultiDataSourceJdbcRepository<Anchor
}
});
log.debug("{} 배치 삽입 완료: {} 건", "BerthCallEntity", berthCallEntityList.size());
// log.debug("{} 배치 삽입 완료: {} 건", "BerthCallEntity", berthCallEntityList.size());
}
public void bindBerthCall(PreparedStatement pstmt, BerthCallEntity entity) throws Exception {
@@ -170,7 +170,7 @@ public class MovementRepositoryImpl extends MultiDataSourceJdbcRepository<Anchor
if (currentlyAtEntityList == null || currentlyAtEntityList.isEmpty()) {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", "CurrentlyAtEntity", currentlyAtEntityList.size());
// log.debug("{} 배치 삽입 시작: {} 건", "CurrentlyAtEntity", currentlyAtEntityList.size());
batchJdbcTemplate.batchUpdate(sql, currentlyAtEntityList, currentlyAtEntityList.size(),
(ps, entity) -> {
@@ -182,7 +182,7 @@ public class MovementRepositoryImpl extends MultiDataSourceJdbcRepository<Anchor
}
});
log.debug("{} 배치 삽입 완료: {} 건", "CurrentlyAtEntity", currentlyAtEntityList.size());
// log.debug("{} 배치 삽입 완료: {} 건", "CurrentlyAtEntity", currentlyAtEntityList.size());
}
public void bindCurrentlyAt(PreparedStatement pstmt, CurrentlyAtEntity entity) throws Exception {
@@ -218,7 +218,7 @@ public class MovementRepositoryImpl extends MultiDataSourceJdbcRepository<Anchor
if (destinationEntityList == null || destinationEntityList.isEmpty()) {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", "DestinationEntity", destinationEntityList.size());
// log.debug("{} 배치 삽입 시작: {} 건", "DestinationEntity", destinationEntityList.size());
batchJdbcTemplate.batchUpdate(sql, destinationEntityList, destinationEntityList.size(),
(ps, entity) -> {
@@ -230,7 +230,7 @@ public class MovementRepositoryImpl extends MultiDataSourceJdbcRepository<Anchor
}
});
log.debug("{} 배치 삽입 완료: {} 건", "DestinationEntity", destinationEntityList.size());
// log.debug("{} 배치 삽입 완료: {} 건", "DestinationEntity", destinationEntityList.size());
}
public void bindDestination(PreparedStatement pstmt, DestinationEntity entity) throws Exception {
@@ -257,7 +257,7 @@ public class MovementRepositoryImpl extends MultiDataSourceJdbcRepository<Anchor
if (portCallEntityList == null || portCallEntityList.isEmpty()) {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", "PortCallEntity", portCallEntityList.size());
// log.debug("{} 배치 삽입 시작: {} 건", "PortCallEntity", portCallEntityList.size());
batchJdbcTemplate.batchUpdate(sql, portCallEntityList, portCallEntityList.size(),
(ps, entity) -> {
@@ -269,7 +269,7 @@ public class MovementRepositoryImpl extends MultiDataSourceJdbcRepository<Anchor
}
});
log.debug("{} 배치 삽입 완료: {} 건", "PortCallEntity", portCallEntityList.size());
// log.debug("{} 배치 삽입 완료: {} 건", "PortCallEntity", portCallEntityList.size());
}
public void bindPortCall(PreparedStatement pstmt, PortCallEntity entity) throws Exception {
@@ -305,7 +305,7 @@ public class MovementRepositoryImpl extends MultiDataSourceJdbcRepository<Anchor
if (stsOperationEntityList == null || stsOperationEntityList.isEmpty()) {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", "StsOperationEntity", stsOperationEntityList.size());
// log.debug("{} 배치 삽입 시작: {} 건", "StsOperationEntity", stsOperationEntityList.size());
batchJdbcTemplate.batchUpdate(sql, stsOperationEntityList, stsOperationEntityList.size(),
(ps, entity) -> {
@@ -317,7 +317,7 @@ public class MovementRepositoryImpl extends MultiDataSourceJdbcRepository<Anchor
}
});
log.debug("{} 배치 삽입 완료: {} 건", "StsOperationEntity", stsOperationEntityList.size());
// log.debug("{} 배치 삽입 완료: {} 건", "StsOperationEntity", stsOperationEntityList.size());
}
public void bindStsOperation(PreparedStatement pstmt, StsOperationEntity entity) throws Exception {
@@ -351,7 +351,7 @@ public class MovementRepositoryImpl extends MultiDataSourceJdbcRepository<Anchor
if (terminalCallEntityList == null || terminalCallEntityList.isEmpty()) {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", "TerminalCallEntity", terminalCallEntityList.size());
// log.debug("{} 배치 삽입 시작: {} 건", "TerminalCallEntity", terminalCallEntityList.size());
batchJdbcTemplate.batchUpdate(sql, terminalCallEntityList, terminalCallEntityList.size(),
(ps, entity) -> {
@@ -363,7 +363,7 @@ public class MovementRepositoryImpl extends MultiDataSourceJdbcRepository<Anchor
}
});
log.debug("{} 배치 삽입 완료: {} 건", "TerminalCallEntity", terminalCallEntityList.size());
// log.debug("{} 배치 삽입 완료: {} 건", "TerminalCallEntity", terminalCallEntityList.size());
}
public void bindTerminalCall(PreparedStatement pstmt, TerminalCallEntity entity) throws Exception {
@@ -399,7 +399,7 @@ public class MovementRepositoryImpl extends MultiDataSourceJdbcRepository<Anchor
if (transitEntityList == null || transitEntityList.isEmpty()) {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", "TransitEntity", transitEntityList.size());
// log.debug("{} 배치 삽입 시작: {} 건", "TransitEntity", transitEntityList.size());
batchJdbcTemplate.batchUpdate(sql, transitEntityList, transitEntityList.size(),
(ps, entity) -> {
@@ -411,7 +411,7 @@ public class MovementRepositoryImpl extends MultiDataSourceJdbcRepository<Anchor
}
});
log.debug("{} 배치 삽입 완료: {} 건", "TransitEntity", transitEntityList.size());
// log.debug("{} 배치 삽입 완료: {} 건", "TransitEntity", transitEntityList.size());
}
public void bindTransit(PreparedStatement pstmt, TransitEntity entity) throws Exception {

View File

@@ -2,10 +2,7 @@ package com.snp.batch.jobs.datasync.batch.psc.config;
import com.snp.batch.common.batch.config.BaseJobConfig;
import com.snp.batch.common.util.BatchWriteListener;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.util.GroupByExecutionIdChunkListener;
import com.snp.batch.common.util.GroupByExecutionIdPolicy;
import com.snp.batch.common.util.GroupByExecutionIdReadListener;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.psc.dto.PscAllCertificateDto;
import com.snp.batch.jobs.datasync.batch.psc.dto.PscDefectDto;
@@ -132,20 +129,17 @@ public class PscSyncJobConfig extends BaseJobConfig<PscDetailDto, PscDetailEntit
@Bean
public BatchWriteListener<PscDetailEntity> pscDetailWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourcePscDetail);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourcePscDetail);
}
@Bean
public BatchWriteListener<PscDefectEntity> pscDefectWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourcePscDefect);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourcePscDefect);
}
@Bean
public BatchWriteListener<PscAllCertificateEntity> pscAllCertificateWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourcePscAllCertificate);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourcePscAllCertificate);
}
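The listener beans above now pass the source table name instead of a prebuilt SQL string, which suggests `BatchWriteListener` derives its completion query internally. A self-contained sketch of that constructor change (the class shapes and the `UPDATE` text are illustrative assumptions, not the real `CommonSql` output):

```java
// Stand-in for CommonSql; the real query text is not shown in this diff.
final class CommonSqlSketch {
    static String getCompleteBatchQuery(String table) {
        return "UPDATE " + table + " SET batch_status = 'COMPLETE' WHERE job_execution_id = ?";
    }
}

final class BatchWriteListenerSketch {
    private final String completeSql;

    // Before: every @Bean method built the SQL itself and passed it in.
    // After: query construction lives in one place, inside the listener.
    BatchWriteListenerSketch(String sourceTable) {
        this.completeSql = CommonSqlSketch.getCompleteBatchQuery(sourceTable);
    }

    String completeSql() { return completeSql; }
}
```

The payoff is visible in the diff: three near-identical two-line bean bodies collapse to one line each.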
// --- Steps ---
@@ -154,12 +148,10 @@ public class PscSyncJobConfig extends BaseJobConfig<PscDetailDto, PscDetailEntit
public Step pscDetailSyncStep() {
log.info("Step 생성: pscDetailSyncStep");
return new StepBuilder(getStepName(), jobRepository)
.<PscDetailDto, PscDetailEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<PscDetailDto, PscDetailEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(createReader())
.processor(createProcessor())
.writer(createWriter())
.listener(new GroupByExecutionIdReadListener<PscDetailDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(pscDetailWriteListener())
.build();
}
@@ -168,12 +160,10 @@ public class PscSyncJobConfig extends BaseJobConfig<PscDetailDto, PscDetailEntit
public Step pscDefectSyncStep() {
log.info("Step 생성: pscDefectSyncStep");
return new StepBuilder("pscDefectSyncStep", jobRepository)
.<PscDefectDto, PscDefectEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<PscDefectDto, PscDefectEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(pscDefectReader(businessDataSource, tableMetaInfo))
.processor(new PscDefectProcessor())
.writer(new PscDefectWriter(pscRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<PscDefectDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(pscDefectWriteListener())
.build();
}
@@ -182,12 +172,10 @@ public class PscSyncJobConfig extends BaseJobConfig<PscDetailDto, PscDetailEntit
public Step pscAllCertificateSyncStep() {
log.info("Step 생성: pscAllCertificateSyncStep");
return new StepBuilder("pscAllCertificateSyncStep", jobRepository)
.<PscAllCertificateDto, PscAllCertificateEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<PscAllCertificateDto, PscAllCertificateEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(pscAllCertificateReader(businessDataSource, tableMetaInfo))
.processor(new PscAllCertificateProcessor())
.writer(new PscAllCertificateWriter(pscRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<PscAllCertificateDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(pscAllCertificateWriteListener())
.build();
}
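The steps above swap `GroupByExecutionIdPolicy` for a plain `chunk(Integer.MAX_VALUE)`. A minimal chunk-loop sketch (not Spring Batch itself) of what that effectively unbounded size means: the reader is drained until it returns null and the writer fires once for everything read, so the custom per-group completion policy and its listeners become unnecessary.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

// Simplified chunk-oriented loop: read until null, writing whenever the
// chunk size is reached; an Integer.MAX_VALUE size yields a single write.
final class ChunkLoopSketch {
    static <T> List<List<T>> run(Supplier<T> reader, int chunkSize) {
        List<List<T>> written = new ArrayList<>();
        List<T> chunk = new ArrayList<>();
        T item;
        while ((item = reader.get()) != null) {
            chunk.add(item);
            if (chunk.size() >= chunkSize) {   // chunk boundary: write and reset
                written.add(new ArrayList<>(chunk));
                chunk.clear();
            }
        }
        if (!chunk.isEmpty()) {
            written.add(chunk);                // final partial chunk
        }
        return written;
    }
}
```

With a bounded chunk size the same loop would produce several writes; whether a single large chunk is acceptable here depends on the group sizes these sync jobs actually read, which this diff does not show.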

View File

@@ -1,87 +1,55 @@
package com.snp.batch.jobs.datasync.batch.psc.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.psc.dto.PscAllCertificateDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.ArrayList;
import java.util.List;
@Slf4j
public class PscAllCertificateReader implements ItemReader<PscAllCertificateDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<PscAllCertificateDto> allDataBuffer = new ArrayList<>();
public class PscAllCertificateReader extends BaseSyncReader<PscAllCertificateDto> {
public PscAllCertificateReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public PscAllCertificateDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourcePscAllCertificate;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourcePscAllCertificate), Long.class);
} catch (Exception e) {
return;
}
@Override
protected PscAllCertificateDto mapRow(ResultSet rs, Long targetId) throws SQLException {
Timestamp expryYmdTs = rs.getTimestamp("expry_ymd");
Timestamp lastInspectionYmdTs = rs.getTimestamp("last_inspection_ymd");
if (nextTargetId != null) {
log.info("[PscAllCertificateReader] 다음 처리 대상 ID 발견: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourcePscAllCertificate);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
Timestamp expryYmdTs = rs.getTimestamp("expry_ymd");
Timestamp lastInspectionYmdTs = rs.getTimestamp("last_inspection_ymd");
return PscAllCertificateDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.certId(rs.getString("cert_id"))
.inspectionId(rs.getString("inspection_id"))
.imoNo(rs.getString("imo_no"))
.certfNmCd(rs.getString("certf_nm_cd"))
.certfNm(rs.getString("certf_nm"))
.issueEnginesCd(rs.getString("issue_engines_cd"))
.issueEngines(rs.getString("issue_engines"))
.etcIssueEngines(rs.getString("etc_issue_engines"))
.issueYmd(rs.getString("issue_ymd"))
.expryYmd(expryYmdTs != null ? expryYmdTs.toLocalDateTime() : null)
.lastInspectionYmd(lastInspectionYmdTs != null ? lastInspectionYmdTs.toLocalDateTime() : null)
.inspectionEnginesCd(rs.getString("inspection_engines_cd"))
.inspectionEngines(rs.getString("inspection_engines"))
.etcInspectionEngines(rs.getString("etc_inspection_engines"))
.recentInspectionPlc(rs.getString("recent_inspection_plc"))
.recentInspectionPlcCd(rs.getString("recent_inspection_plc_cd"))
.inspectionEnginesType(rs.getString("inspection_engines_type"))
.checkYmd(rs.getString("check_ymd"))
.insptr(rs.getString("insptr"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourcePscAllCertificate);
businessJdbcTemplate.update(sql, targetExecutionId);
return PscAllCertificateDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.certId(rs.getString("cert_id"))
.inspectionId(rs.getString("inspection_id"))
.imoNo(rs.getString("imo_no"))
.certfNmCd(rs.getString("certf_nm_cd"))
.certfNm(rs.getString("certf_nm"))
.issueEnginesCd(rs.getString("issue_engines_cd"))
.issueEngines(rs.getString("issue_engines"))
.etcIssueEngines(rs.getString("etc_issue_engines"))
.issueYmd(rs.getString("issue_ymd"))
.expryYmd(expryYmdTs != null ? expryYmdTs.toLocalDateTime() : null)
.lastInspectionYmd(lastInspectionYmdTs != null ? lastInspectionYmdTs.toLocalDateTime() : null)
.inspectionEnginesCd(rs.getString("inspection_engines_cd"))
.inspectionEngines(rs.getString("inspection_engines"))
.etcInspectionEngines(rs.getString("etc_inspection_engines"))
.recentInspectionPlc(rs.getString("recent_inspection_plc"))
.recentInspectionPlcCd(rs.getString("recent_inspection_plc_cd"))
.inspectionEnginesType(rs.getString("inspection_engines_type"))
.checkYmd(rs.getString("check_ymd"))
.insptr(rs.getString("insptr"))
.build();
}
}

View File

@@ -1,87 +1,55 @@
package com.snp.batch.jobs.datasync.batch.psc.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.psc.dto.PscDefectDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class PscDefectReader implements ItemReader<PscDefectDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<PscDefectDto> allDataBuffer = new ArrayList<>();
public class PscDefectReader extends BaseSyncReader<PscDefectDto> {
public PscDefectReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public PscDefectDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourcePscDefect;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourcePscDefect), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[PscDefectReader] 다음 처리 대상 ID 발견: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourcePscDefect);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return PscDefectDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.defectId(rs.getString("defect_id"))
.inspectionId(rs.getString("inspection_id"))
.actnOne(rs.getString("actn_one"))
.actnTwo(rs.getString("actn_two"))
.actnThr(rs.getString("actn_thr"))
.actnCdOne(rs.getString("actn_cd_one"))
.actnCdTwo(rs.getString("actn_cd_two"))
.actnCdThr(rs.getString("actn_cd_thr"))
.clficRespsbYn(rs.getString("clfic_respsb_yn"))
.defectCd(rs.getString("defect_cd"))
.defectCn(rs.getString("defect_cn"))
.defectIemCd(rs.getString("defect_iem_cd"))
.detainedReasonDefect(rs.getString("detained_reason_defect"))
.mainDefectCd(rs.getString("main_defect_cd"))
.mainDefectCn(rs.getString("main_defect_cn"))
.defectTypeCd(rs.getString("defect_type_cd"))
.defectTypeNm(rs.getString("defect_type_nm"))
.etcActn(rs.getString("etc_actn"))
.etcPubcEnginesRespsb(rs.getString("etc_pubc_engines_respsb"))
.pubcEnginesRespsb(rs.getString("pubc_engines_respsb"))
.pubcEnginesRespsbCd(rs.getString("pubc_engines_respsb_cd"))
.pubcEnginesRespsbYn(rs.getString("pubc_engines_respsb_yn"))
.acdntDamgYn(rs.getString("acdnt_damg_yn"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourcePscDefect);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected PscDefectDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return PscDefectDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.defectId(rs.getString("defect_id"))
.inspectionId(rs.getString("inspection_id"))
.actnOne(rs.getString("actn_one"))
.actnTwo(rs.getString("actn_two"))
.actnThr(rs.getString("actn_thr"))
.actnCdOne(rs.getString("actn_cd_one"))
.actnCdTwo(rs.getString("actn_cd_two"))
.actnCdThr(rs.getString("actn_cd_thr"))
.clficRespsbYn(rs.getString("clfic_respsb_yn"))
.defectCd(rs.getString("defect_cd"))
.defectCn(rs.getString("defect_cn"))
.defectIemCd(rs.getString("defect_iem_cd"))
.detainedReasonDefect(rs.getString("detained_reason_defect"))
.mainDefectCd(rs.getString("main_defect_cd"))
.mainDefectCn(rs.getString("main_defect_cn"))
.defectTypeCd(rs.getString("defect_type_cd"))
.defectTypeNm(rs.getString("defect_type_nm"))
.etcActn(rs.getString("etc_actn"))
.etcPubcEnginesRespsb(rs.getString("etc_pubc_engines_respsb"))
.pubcEnginesRespsb(rs.getString("pubc_engines_respsb"))
.pubcEnginesRespsbCd(rs.getString("pubc_engines_respsb_cd"))
.pubcEnginesRespsbYn(rs.getString("pubc_engines_respsb_yn"))
.acdntDamgYn(rs.getString("acdnt_damg_yn"))
.build();
}
}

View File

@@ -1,98 +1,66 @@
package com.snp.batch.jobs.datasync.batch.psc.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.psc.dto.PscDetailDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.ArrayList;
import java.util.List;
@Slf4j
public class PscDetailReader implements ItemReader<PscDetailDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<PscDetailDto> allDataBuffer = new ArrayList<>();
public class PscDetailReader extends BaseSyncReader<PscDetailDto> {
public PscDetailReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public PscDetailDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourcePscDetail;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourcePscDetail), Long.class);
} catch (Exception e) {
return;
}
@Override
protected PscDetailDto mapRow(ResultSet rs, Long targetId) throws SQLException {
Timestamp inspectionYmdTs = rs.getTimestamp("inspection_ymd");
Timestamp tkoffPrmtYmdTs = rs.getTimestamp("tkoff_prmt_ymd");
Timestamp lastMdfcnDtTs = rs.getTimestamp("last_mdfcn_dt");
if (nextTargetId != null) {
log.info("[PscDetailReader] 다음 처리 대상 ID 발견: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourcePscDetail);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
Timestamp inspectionYmdTs = rs.getTimestamp("inspection_ymd");
Timestamp tkoffPrmtYmdTs = rs.getTimestamp("tkoff_prmt_ymd");
Timestamp lastMdfcnDtTs = rs.getTimestamp("last_mdfcn_dt");
return PscDetailDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.inspectionId(rs.getString("inspection_id"))
.typeId(rs.getString("type_id"))
.clsgnNo(rs.getString("clsgn_no"))
.chrter(rs.getString("chrter"))
.clfic(rs.getString("clfic"))
.country(rs.getString("country"))
.inspectionYmd(inspectionYmdTs != null ? inspectionYmdTs.toLocalDateTime() : null)
.tkoffPrmtYmd(tkoffPrmtYmdTs != null ? tkoffPrmtYmdTs.toLocalDateTime() : null)
.shipDetainedYn(rs.getString("ship_detained_yn"))
.dwt(rs.getString("dwt"))
.expndInspectionYn(rs.getString("expnd_inspection_yn"))
.flg(rs.getString("flg"))
.folwInspectionYn(rs.getString("folw_inspection_yn"))
.gt(rs.getString("gt"))
.inspectionPortNm(rs.getString("inspection_port_nm"))
.lastMdfcnDt(lastMdfcnDtTs != null ? lastMdfcnDtTs.toLocalDateTime() : null)
.shipMngr(rs.getString("ship_mngr"))
.detainedDays(rs.getObject("detained_days") != null ? rs.getInt("detained_days") : null)
.defectCnt(rs.getString("defect_cnt"))
.defectCntDays(rs.getBigDecimal("defect_cnt_days"))
.etcInspectionType(rs.getString("etc_inspection_type"))
.shponr(rs.getString("shponr"))
.shipNm(rs.getString("ship_nm"))
.shipTypeCd(rs.getString("ship_type_cd"))
.shipTypeNm(rs.getString("ship_type_nm"))
.dataSrc(rs.getString("data_src"))
.unPortCd(rs.getString("un_port_cd"))
.buildYy(rs.getString("build_yy"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourcePscDetail);
businessJdbcTemplate.update(sql, targetExecutionId);
return PscDetailDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.inspectionId(rs.getString("inspection_id"))
.typeId(rs.getString("type_id"))
.clsgnNo(rs.getString("clsgn_no"))
.chrter(rs.getString("chrter"))
.clfic(rs.getString("clfic"))
.country(rs.getString("country"))
.inspectionYmd(inspectionYmdTs != null ? inspectionYmdTs.toLocalDateTime() : null)
.tkoffPrmtYmd(tkoffPrmtYmdTs != null ? tkoffPrmtYmdTs.toLocalDateTime() : null)
.shipDetainedYn(rs.getString("ship_detained_yn"))
.dwt(rs.getString("dwt"))
.expndInspectionYn(rs.getString("expnd_inspection_yn"))
.flg(rs.getString("flg"))
.folwInspectionYn(rs.getString("folw_inspection_yn"))
.gt(rs.getString("gt"))
.inspectionPortNm(rs.getString("inspection_port_nm"))
.lastMdfcnDt(lastMdfcnDtTs != null ? lastMdfcnDtTs.toLocalDateTime() : null)
.shipMngr(rs.getString("ship_mngr"))
.detainedDays(rs.getObject("detained_days") != null ? rs.getInt("detained_days") : null)
.defectCnt(rs.getString("defect_cnt"))
.defectCntDays(rs.getBigDecimal("defect_cnt_days"))
.etcInspectionType(rs.getString("etc_inspection_type"))
.shponr(rs.getString("shponr"))
.shipNm(rs.getString("ship_nm"))
.shipTypeCd(rs.getString("ship_type_cd"))
.shipTypeNm(rs.getString("ship_type_nm"))
.dataSrc(rs.getString("data_src"))
.unPortCd(rs.getString("un_port_cd"))
.buildYy(rs.getString("build_yy"))
.build();
}
}

View File

@@ -83,7 +83,7 @@ public class PscRepositoryImpl extends MultiDataSourceJdbcRepository<PscDetailEn
if (pscDetailEntityList == null || pscDetailEntityList.isEmpty()) {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", "PscDetailEntity", pscDetailEntityList.size());
// log.debug("{} 배치 삽입 시작: {} 건", "PscDetailEntity", pscDetailEntityList.size());
batchJdbcTemplate.batchUpdate(sql, pscDetailEntityList, pscDetailEntityList.size(),
(ps, entity) -> {
@@ -95,7 +95,7 @@ public class PscRepositoryImpl extends MultiDataSourceJdbcRepository<PscDetailEn
}
});
log.debug("{} 배치 삽입 완료: {} 건", "PscDetailEntity", pscDetailEntityList.size());
// log.debug("{} 배치 삽입 완료: {} 건", "PscDetailEntity", pscDetailEntityList.size());
}
public void bindPscDetail(PreparedStatement pstmt, PscDetailEntity entity) throws Exception {
@@ -139,7 +139,7 @@ public class PscRepositoryImpl extends MultiDataSourceJdbcRepository<PscDetailEn
if (pscDefectEntityList == null || pscDefectEntityList.isEmpty()) {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", "PscDefectEntity", pscDefectEntityList.size());
// log.debug("{} 배치 삽입 시작: {} 건", "PscDefectEntity", pscDefectEntityList.size());
batchJdbcTemplate.batchUpdate(sql, pscDefectEntityList, pscDefectEntityList.size(),
(ps, entity) -> {
@@ -151,7 +151,7 @@ public class PscRepositoryImpl extends MultiDataSourceJdbcRepository<PscDetailEn
}
});
log.debug("{} 배치 삽입 완료: {} 건", "PscDefectEntity", pscDefectEntityList.size());
// log.debug("{} 배치 삽입 완료: {} 건", "PscDefectEntity", pscDefectEntityList.size());
}
public void bindPscDefect(PreparedStatement pstmt, PscDefectEntity entity) throws Exception {
@@ -189,7 +189,7 @@ public class PscRepositoryImpl extends MultiDataSourceJdbcRepository<PscDetailEn
if (pscAllCertificateEntityList == null || pscAllCertificateEntityList.isEmpty()) {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", "PscAllCertificateEntity", pscAllCertificateEntityList.size());
// log.debug("{} 배치 삽입 시작: {} 건", "PscAllCertificateEntity", pscAllCertificateEntityList.size());
batchJdbcTemplate.batchUpdate(sql, pscAllCertificateEntityList, pscAllCertificateEntityList.size(),
(ps, entity) -> {
@@ -201,7 +201,7 @@ public class PscRepositoryImpl extends MultiDataSourceJdbcRepository<PscDetailEn
}
});
log.debug("{} 배치 삽입 완료: {} 건", "PscAllCertificateEntity", pscAllCertificateEntityList.size());
// log.debug("{} 배치 삽입 완료: {} 건", "PscAllCertificateEntity", pscAllCertificateEntityList.size());
}
public void bindPscAllCertificate(PreparedStatement pstmt, PscAllCertificateEntity entity) throws Exception {


@@ -2,10 +2,7 @@ package com.snp.batch.jobs.datasync.batch.risk.config;
import com.snp.batch.common.batch.config.BaseJobConfig;
import com.snp.batch.common.util.BatchWriteListener;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.util.GroupByExecutionIdChunkListener;
import com.snp.batch.common.util.GroupByExecutionIdPolicy;
import com.snp.batch.common.util.GroupByExecutionIdReadListener;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.risk.dto.RiskDto;
import com.snp.batch.jobs.datasync.batch.risk.entity.RiskEntity;
@@ -102,8 +99,7 @@ public class RiskSyncJobConfig extends BaseJobConfig<RiskDto, RiskEntity> {
@Bean
public BatchWriteListener<RiskEntity> riskWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceRisk);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceRisk);
}
// --- Steps ---
@@ -112,12 +108,10 @@ public class RiskSyncJobConfig extends BaseJobConfig<RiskDto, RiskEntity> {
public Step riskSyncStep() {
log.info("Step 생성: riskSyncStep");
return new StepBuilder(getStepName(), jobRepository)
.<RiskDto, RiskEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<RiskDto, RiskEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(createReader())
.processor(createProcessor())
.writer(createWriter())
.listener(new GroupByExecutionIdReadListener<RiskDto>())
.listener(new GroupByExecutionIdChunkListener())
.listener(riskWriteListener())
.build();
}
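The riskSyncStep change above swaps the custom GroupByExecutionIdPolicy for a plain Integer.MAX_VALUE chunk size: since the reader now returns one execution-ID group at a time, a single unbounded chunk covers the whole group. A minimal, self-contained sketch of that chunking behavior (plain Java, with an Iterator standing in for the Spring Batch reader; this is an illustration, not the project's actual code):

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class ChunkDrainDemo {
    // Drains a reader (an Iterator standing in for ItemReader.read()) into one
    // chunk, stopping at end-of-data or the chunk limit. With a limit of
    // Integer.MAX_VALUE, the whole group lands in a single chunk, i.e. a single
    // transaction, making a custom completion policy unnecessary.
    static <T> List<T> readChunk(Iterator<T> reader, int chunkSize) {
        List<T> chunk = new ArrayList<>();
        while (chunk.size() < chunkSize && reader.hasNext()) {
            chunk.add(reader.next());
        }
        return chunk;
    }

    public static void main(String[] args) {
        List<String> group = List.of("imo-1", "imo-2", "imo-3");
        List<String> chunk = readChunk(group.iterator(), Integer.MAX_VALUE);
        System.out.println(chunk.size()); // prints 3
    }
}
```

The trade-off is that one group is now committed as one unit, so per-group transaction boundaries replace the fixed-size commits of a conventional chunk size.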


@@ -1,108 +1,76 @@
package com.snp.batch.jobs.datasync.batch.risk.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.risk.dto.RiskDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.ArrayList;
import java.util.List;
@Slf4j
public class RiskReader implements ItemReader<RiskDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<RiskDto> allDataBuffer = new ArrayList<>();
public class RiskReader extends BaseSyncReader<RiskDto> {
public RiskReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public RiskDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceRisk;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceRisk), Long.class);
} catch (Exception e) {
return;
}
@Override
protected RiskDto mapRow(ResultSet rs, Long targetId) throws SQLException {
Timestamp lastMdfcnDtTs = rs.getTimestamp("last_mdfcn_dt");
if (nextTargetId != null) {
log.info("[RiskReader] 다음 처리 대상 ID 발견: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceRisk);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
Timestamp lastMdfcnDtTs = rs.getTimestamp("last_mdfcn_dt");
return RiskDto.builder()
.jobExecutionId(targetId)
.imoNo(rs.getString("imo_no"))
.lastMdfcnDt(lastMdfcnDtTs != null ? lastMdfcnDtTs.toLocalDateTime() : null)
.riskDataMaint(rs.getString("risk_data_maint"))
.aisNotrcvElpsDays(rs.getString("ais_notrcv_elps_days"))
.aisLwrnkDays(rs.getString("ais_lwrnk_days"))
.aisUpImoDesc(rs.getString("ais_up_imo_desc"))
.othrShipNmVoyYn(rs.getString("othr_ship_nm_voy_yn"))
.mmsiAnomMessage(rs.getString("mmsi_anom_message"))
.recentDarkActv(rs.getString("recent_dark_actv"))
.portPrtcll(rs.getString("port_prtcll"))
.portRisk(rs.getString("port_risk"))
.stsJob(rs.getString("sts_job"))
.driftChg(rs.getString("drift_chg"))
.riskEvent(rs.getString("risk_event"))
.ntnltyChg(rs.getString("ntnlty_chg"))
.ntnltyPrsMouPerf(rs.getString("ntnlty_prs_mou_perf"))
.ntnltyTkyMouPerf(rs.getString("ntnlty_tky_mou_perf"))
.ntnltyUscgMouPerf(rs.getString("ntnlty_uscg_mou_perf"))
.uscgExclShipCert(rs.getString("uscg_excl_ship_cert"))
.pscInspectionElpsHr(rs.getString("psc_inspection_elps_hr"))
.pscInspection(rs.getString("psc_inspection"))
.pscDefect(rs.getString("psc_defect"))
.pscDetained(rs.getString("psc_detained"))
.nowSmgrcEvdc(rs.getString("now_smgrc_evdc"))
.doccChg(rs.getString("docc_chg"))
.nowClfic(rs.getString("now_clfic"))
.clficStatusChg(rs.getString("clfic_status_chg"))
.pniInsrnc(rs.getString("pni_insrnc"))
.shipNmChg(rs.getString("ship_nm_chg"))
.gboChg(rs.getString("gbo_chg"))
.vslage(rs.getString("vslage"))
.ilglFshrViol(rs.getString("ilgl_fshr_viol"))
.draftChg(rs.getString("draft_chg"))
.recentSanctionPrtcll(rs.getString("recent_sanction_prtcll"))
.snglShipVoy(rs.getString("sngl_ship_voy"))
.fltsfty(rs.getString("fltsfty"))
.fltPsc(rs.getString("flt_psc"))
.spcInspectionOvdue(rs.getString("spc_inspection_ovdue"))
.ownrUnk(rs.getString("ownr_unk"))
.rssPortCall(rs.getString("rss_port_call"))
.rssOwnrReg(rs.getString("rss_ownr_reg"))
.rssSts(rs.getString("rss_sts"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceRisk);
businessJdbcTemplate.update(sql, targetExecutionId);
return RiskDto.builder()
.jobExecutionId(targetId)
.imoNo(rs.getString("imo_no"))
.lastMdfcnDt(lastMdfcnDtTs != null ? lastMdfcnDtTs.toLocalDateTime() : null)
.riskDataMaint(rs.getString("risk_data_maint"))
.aisNotrcvElpsDays(rs.getString("ais_notrcv_elps_days"))
.aisLwrnkDays(rs.getString("ais_lwrnk_days"))
.aisUpImoDesc(rs.getString("ais_up_imo_desc"))
.othrShipNmVoyYn(rs.getString("othr_ship_nm_voy_yn"))
.mmsiAnomMessage(rs.getString("mmsi_anom_message"))
.recentDarkActv(rs.getString("recent_dark_actv"))
.portPrtcll(rs.getString("port_prtcll"))
.portRisk(rs.getString("port_risk"))
.stsJob(rs.getString("sts_job"))
.driftChg(rs.getString("drift_chg"))
.riskEvent(rs.getString("risk_event"))
.ntnltyChg(rs.getString("ntnlty_chg"))
.ntnltyPrsMouPerf(rs.getString("ntnlty_prs_mou_perf"))
.ntnltyTkyMouPerf(rs.getString("ntnlty_tky_mou_perf"))
.ntnltyUscgMouPerf(rs.getString("ntnlty_uscg_mou_perf"))
.uscgExclShipCert(rs.getString("uscg_excl_ship_cert"))
.pscInspectionElpsHr(rs.getString("psc_inspection_elps_hr"))
.pscInspection(rs.getString("psc_inspection"))
.pscDefect(rs.getString("psc_defect"))
.pscDetained(rs.getString("psc_detained"))
.nowSmgrcEvdc(rs.getString("now_smgrc_evdc"))
.doccChg(rs.getString("docc_chg"))
.nowClfic(rs.getString("now_clfic"))
.clficStatusChg(rs.getString("clfic_status_chg"))
.pniInsrnc(rs.getString("pni_insrnc"))
.shipNmChg(rs.getString("ship_nm_chg"))
.gboChg(rs.getString("gbo_chg"))
.vslage(rs.getString("vslage"))
.ilglFshrViol(rs.getString("ilgl_fshr_viol"))
.draftChg(rs.getString("draft_chg"))
.recentSanctionPrtcll(rs.getString("recent_sanction_prtcll"))
.snglShipVoy(rs.getString("sngl_ship_voy"))
.fltsfty(rs.getString("fltsfty"))
.fltPsc(rs.getString("flt_psc"))
.spcInspectionOvdue(rs.getString("spc_inspection_ovdue"))
.ownrUnk(rs.getString("ownr_unk"))
.rssPortCall(rs.getString("rss_port_call"))
.rssOwnrReg(rs.getString("rss_ownr_reg"))
.rssSts(rs.getString("rss_sts"))
.build();
}
}
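The rewritten RiskReader above pushes the buffer-and-fetch loop into a shared BaseSyncReader, leaving subclasses to supply only the source table and the row mapping. A simplified, runnable sketch of that template-method shape, with JDBC replaced by an in-memory stub (class and method names are assumptions read off the diff, not the project's actual base class):

```java
import java.util.ArrayList;
import java.util.List;

// Simplified template-method sketch: the base class owns buffering and
// drain-one-item-per-read(); subclasses plug in the table name and the group
// fetch (which stands in for the JDBC query plus mapRow).
abstract class BaseSyncReaderSketch<T> {
    private final List<T> buffer = new ArrayList<>();
    private boolean fetched = false;

    // Mirrors ItemReader.read(): fetch the next group once, then drain it;
    // returning null signals end of input.
    public T read() {
        if (!fetched) {
            buffer.addAll(fetchNextGroup());
            fetched = true;
        }
        return buffer.isEmpty() ? null : buffer.remove(0);
    }

    protected abstract String getSourceTable();   // e.g. tableMetaInfo.sourceRisk
    protected abstract List<T> fetchNextGroup();  // stub for the JDBC query
}

class DemoReader extends BaseSyncReaderSketch<String> {
    @Override protected String getSourceTable() { return "source_risk"; }
    @Override protected List<String> fetchNextGroup() {
        return new ArrayList<>(List.of("row-1", "row-2"));
    }
}
```

This is why the diff shrinks RiskReader to a constructor, getSourceTable(), and mapRow(): everything else is shared.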


@@ -80,7 +80,7 @@ public class RiskRepositoryImpl extends MultiDataSourceJdbcRepository<RiskEntity
if (riskEntityList == null || riskEntityList.isEmpty()) {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", "RiskEntity", riskEntityList.size());
// log.debug("{} 배치 삽입 시작: {} 건", "RiskEntity", riskEntityList.size());
batchJdbcTemplate.batchUpdate(sql, riskEntityList, riskEntityList.size(),
(ps, entity) -> {
@@ -92,7 +92,7 @@ public class RiskRepositoryImpl extends MultiDataSourceJdbcRepository<RiskEntity
}
});
log.debug("{} 배치 삽입 완료: {} 건", "RiskEntity", riskEntityList.size());
// log.debug("{} 배치 삽입 완료: {} 건", "RiskEntity", riskEntityList.size());
}
@Override
@@ -101,7 +101,7 @@ public class RiskRepositoryImpl extends MultiDataSourceJdbcRepository<RiskEntity
if (riskEntityList == null || riskEntityList.isEmpty()) {
return;
}
log.debug("{} 배치 삽입 시작: {} 건", "RiskEntity", riskEntityList.size());
// log.debug("{} 배치 삽입 시작: {} 건", "RiskEntity", riskEntityList.size());
batchJdbcTemplate.batchUpdate(sql, riskEntityList, riskEntityList.size(),
(ps, entity) -> {
@@ -113,7 +113,7 @@ public class RiskRepositoryImpl extends MultiDataSourceJdbcRepository<RiskEntity
}
});
log.debug("{} 배치 삽입 완료: {} 건", "RiskEntity", riskEntityList.size());
// log.debug("{} 배치 삽입 완료: {} 건", "RiskEntity", riskEntityList.size());
}
public void bindRisk(PreparedStatement pstmt, RiskEntity entity) throws Exception {


@@ -2,10 +2,7 @@ package com.snp.batch.jobs.datasync.batch.ship.config;
import com.snp.batch.common.batch.config.BaseJobConfig;
import com.snp.batch.common.util.BatchWriteListener;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.util.GroupByExecutionIdChunkListener;
import com.snp.batch.common.util.GroupByExecutionIdPolicy;
import com.snp.batch.common.util.GroupByExecutionIdReadListener;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.BareboatCharterHistoryDto;
import com.snp.batch.jobs.datasync.batch.ship.dto.CallsignAndMmsiHistoryDto;
@@ -475,158 +472,132 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
// --- Listeners ---
@Bean
public BatchWriteListener<ShipInfoMstEntity> shipWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceShipDetailData);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceShipDetailData);
}
@Bean
public BatchWriteListener<OwnerHistoryEntity> ownerHistoryWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceOwnerHistory);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceOwnerHistory);
}
@Bean
public BatchWriteListener<ShipAddInfoEntity> shipAddInfoWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceAdditionalShipsData);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceAdditionalShipsData);
}
@Bean
public BatchWriteListener<BareboatCharterHistoryEntity> bareboatCharterHistoryWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceBareboatCharterHistory);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceBareboatCharterHistory);
}
@Bean
public BatchWriteListener<CallsignAndMmsiHistoryEntity> callsignAndMmsiHistoryWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceCallsignAndMmsiHistory);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceCallsignAndMmsiHistory);
}
@Bean
public BatchWriteListener<ClassHistoryEntity> classHistoryWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceClassHistory);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceClassHistory);
}
@Bean
public BatchWriteListener<CompanyVesselRelationshipsEntity> companyVesselRelationshipsWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceCompanyVesselRelationships);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceCompanyVesselRelationships);
}
@Bean
public BatchWriteListener<CrewListEntity> crewListWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceCrewList);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceCrewList);
}
@Bean
public BatchWriteListener<DarkActivityConfirmedEntity> darkActivityConfirmedWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceDarkActivityConfirmed);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceDarkActivityConfirmed);
}
@Bean
public BatchWriteListener<FlagHistoryEntity> flagHistoryWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceFlagHistory);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceFlagHistory);
}
@Bean
public BatchWriteListener<GroupBeneficialOwnerHistoryEntity> groupBeneficialOwnerHistoryWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceGroupBeneficialOwnerHistory);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceGroupBeneficialOwnerHistory);
}
@Bean
public BatchWriteListener<IceClassEntity> iceClassWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceIceClass);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceIceClass);
}
@Bean
public BatchWriteListener<NameHistoryEntity> nameHistoryWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceNameHistory);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceNameHistory);
}
@Bean
public BatchWriteListener<OperatorHistoryEntity> operatorHistoryWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceOperatorHistory);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceOperatorHistory);
}
@Bean
public BatchWriteListener<PandIHistoryEntity> pandIHistoryWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourcePandiHistory);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourcePandiHistory);
}
@Bean
public BatchWriteListener<SafetyManagementCertificateHistEntity> safetyManagementCertificateHistWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceSafetyManagementCertificateHist);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceSafetyManagementCertificateHist);
}
@Bean
public BatchWriteListener<ShipManagerHistoryEntity> shipManagerHistoryWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceShipManagerHistory);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceShipManagerHistory);
}
@Bean
public BatchWriteListener<SisterShipLinksEntity> sisterShipLinksWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceSisterShipLinks);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceSisterShipLinks);
}
@Bean
public BatchWriteListener<SpecialFeatureEntity> specialFeatureWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceSpecialFeature);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceSpecialFeature);
}
@Bean
public BatchWriteListener<StatusHistoryEntity> statusHistoryWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceStatusHistory);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceStatusHistory);
}
@Bean
public BatchWriteListener<StowageCommodityEntity> stowageCommodityWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceStowageCommodity);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceStowageCommodity);
}
@Bean
public BatchWriteListener<SurveyDatesEntity> surveyDatesWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceSurveyDates);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceSurveyDates);
}
@Bean
public BatchWriteListener<SurveyDatesHistoryUniqueEntity> surveyDatesHistoryUniqueWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceSurveyDatesHistoryUnique);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceSurveyDatesHistoryUnique);
}
@Bean
public BatchWriteListener<TechnicalManagerHistoryEntity> technicalManagerHistoryWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceTechnicalManagerHistory);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceTechnicalManagerHistory);
}
@Bean
public BatchWriteListener<ThrustersEntity> thrustersWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceThrusters);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceThrusters);
}
@Bean
public BatchWriteListener<TbCompanyDetailEntity> tbCompanyDetailWriteListener() {
String sql = CommonSql.getCompleteBatchQuery(tableMetaInfo.sourceTbCompanyDetail);
return new BatchWriteListener<>(businessJdbcTemplate, sql);
return new BatchWriteListener<>(businessJdbcTemplate, tableMetaInfo.sourceTbCompanyDetail);
}
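Each listener bean above now passes the source table name instead of a prebuilt SQL string, so query construction moves inside BatchWriteListener and the repeated CommonSql.getCompleteBatchQuery(...) calls disappear from the config. A hedged sketch of what that encapsulation could look like; the SQL text below is an invented placeholder, not the query CommonSql actually produces:

```java
// Hypothetical sketch of the constructor change: the listener derives its
// completion SQL from the table name itself, rather than having every @Bean
// method build the string and pass it in. The UPDATE statement here is an
// assumption for illustration only.
class BatchWriteListenerSketch {
    private final String completeSql;

    BatchWriteListenerSketch(String sourceTable) {
        this.completeSql = buildCompleteBatchQuery(sourceTable);
    }

    // Stands in for CommonSql.getCompleteBatchQuery(sourceTable).
    static String buildCompleteBatchQuery(String sourceTable) {
        return "UPDATE " + sourceTable
                + " SET batch_flag = 'COMPLETE' WHERE job_execution_id = ?";
    }

    String getCompleteSql() { return completeSql; }
}
```

Centralizing the SQL in one constructor means a future change to the completion query touches one class instead of twenty-odd bean definitions.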
// --- Steps ---
@@ -634,12 +605,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
@Bean(name = "snpShipDetailSyncStep")
public Step snpShipDetailSyncStep() {
return new StepBuilder(getStepName(), jobRepository)
.<ShipInfoMstDto, ShipInfoMstEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<ShipInfoMstDto, ShipInfoMstEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(createReader())
.processor(createProcessor())
.writer(createWriter())
.listener(new GroupByExecutionIdReadListener<ShipInfoMstDto>()) // Reader 리스너 (ThreadLocal 설정)
.listener(new GroupByExecutionIdChunkListener()) // Chunk 리스너 (ThreadLocal 정리)
.listener(shipWriteListener()) // Write 완료 batch_flag 업데이트
.build();
}
@@ -648,12 +617,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step ownerHistorySyncStep() {
log.info("Step 생성: ownerHistorySyncStep");
return new StepBuilder("ownerHistorySyncStep", jobRepository)
.<OwnerHistoryDto, OwnerHistoryEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<OwnerHistoryDto, OwnerHistoryEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(ownerHistoryReader(businessDataSource, tableMetaInfo))
.processor(new OwnerHistoryProcessor())
.writer(new OwnerHistoryWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<OwnerHistoryDto>()) // Reader 리스너
.listener(new GroupByExecutionIdChunkListener()) // Chunk 리스너
.listener(ownerHistoryWriteListener()) // Write 완료 batch_flag 업데이트
.build();
}
@@ -662,12 +629,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step shipAddInfoSyncStep() {
log.info("Step 생성: shipAddInfoSyncStep");
return new StepBuilder("shipAddInfoSyncStep", jobRepository)
.<ShipAddInfoDto, ShipAddInfoEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<ShipAddInfoDto, ShipAddInfoEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(shipAddInfoReader(businessDataSource, tableMetaInfo))
.processor(new ShipAddInfoProcessor())
.writer(new ShipAddInfoWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<ShipAddInfoDto>()) // Reader 리스너
.listener(new GroupByExecutionIdChunkListener()) // Chunk 리스너
.listener(shipAddInfoWriteListener()) // Write 완료 batch_flag 업데이트
.build();
}
@@ -676,12 +641,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step bareboatCharterHistorySyncStep() {
log.info("Step 생성: bareboatCharterHistorySyncStep");
return new StepBuilder("bareboatCharterHistorySyncStep", jobRepository)
.<BareboatCharterHistoryDto, BareboatCharterHistoryEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<BareboatCharterHistoryDto, BareboatCharterHistoryEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(bareboatCharterHistoryReader(businessDataSource, tableMetaInfo))
.processor(new BareboatCharterHistoryProcessor())
.writer(new BareboatCharterHistoryWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<BareboatCharterHistoryDto>()) // Reader 리스너
.listener(new GroupByExecutionIdChunkListener()) // Chunk 리스너
.listener(bareboatCharterHistoryWriteListener()) // Write 완료 batch_flag 업데이트
.build();
}
@@ -690,12 +653,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step callsignAndMmsiHistorySyncStep() {
log.info("Step 생성: callsignAndMmsiHistorySyncStep");
return new StepBuilder("callsignAndMmsiHistorySyncStep", jobRepository)
.<CallsignAndMmsiHistoryDto, CallsignAndMmsiHistoryEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<CallsignAndMmsiHistoryDto, CallsignAndMmsiHistoryEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(callsignAndMmsiHistoryReader(businessDataSource, tableMetaInfo))
.processor(new CallsignAndMmsiHistoryProcessor())
.writer(new CallsignAndMmsiHistoryWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<CallsignAndMmsiHistoryDto>()) // Reader 리스너
.listener(new GroupByExecutionIdChunkListener()) // Chunk 리스너
.listener(callsignAndMmsiHistoryWriteListener()) // Write 완료 batch_flag 업데이트
.build();
}
@@ -704,12 +665,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step classHistorySyncStep() {
log.info("Step 생성: classHistorySyncStep");
return new StepBuilder("classHistorySyncStep", jobRepository)
.<ClassHistoryDto, ClassHistoryEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<ClassHistoryDto, ClassHistoryEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(classHistoryReader(businessDataSource, tableMetaInfo))
.processor(new ClassHistoryProcessor())
.writer(new ClassHistoryWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<ClassHistoryDto>()) // Reader 리스너
.listener(new GroupByExecutionIdChunkListener()) // Chunk 리스너
.listener(classHistoryWriteListener()) // Write 완료 batch_flag 업데이트
.build();
}
@@ -718,12 +677,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step companyVesselRelationshipsSyncStep() {
log.info("Step 생성: companyVesselRelationshipsSyncStep");
return new StepBuilder("companyVesselRelationshipsSyncStep", jobRepository)
.<CompanyVesselRelationshipsDto, CompanyVesselRelationshipsEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<CompanyVesselRelationshipsDto, CompanyVesselRelationshipsEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(companyVesselRelationshipsReader(businessDataSource, tableMetaInfo))
.processor(new CompanyVesselRelationshipsProcessor())
.writer(new CompanyVesselRelationshipsWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<CompanyVesselRelationshipsDto>()) // Reader 리스너
.listener(new GroupByExecutionIdChunkListener()) // Chunk 리스너
.listener(companyVesselRelationshipsWriteListener()) // Write 완료 batch_flag 업데이트
.build();
}
@@ -732,12 +689,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step crewListSyncStep() {
log.info("Step 생성: crewListSyncStep");
return new StepBuilder("crewListSyncStep", jobRepository)
.<CrewListDto, CrewListEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<CrewListDto, CrewListEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(crewListReader(businessDataSource, tableMetaInfo))
.processor(new CrewListProcessor())
.writer(new CrewListWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<CrewListDto>()) // Reader 리스너
.listener(new GroupByExecutionIdChunkListener()) // Chunk 리스너
.listener(crewListWriteListener()) // Write 완료 batch_flag 업데이트
.build();
}
@@ -746,12 +701,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step darkActivityConfirmedSyncStep() {
log.info("Step 생성: darkActivityConfirmedSyncStep");
return new StepBuilder("darkActivityConfirmedSyncStep", jobRepository)
.<DarkActivityConfirmedDto, DarkActivityConfirmedEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<DarkActivityConfirmedDto, DarkActivityConfirmedEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(darkActivityConfirmedReader(businessDataSource, tableMetaInfo))
.processor(new DarkActivityConfirmedProcessor())
.writer(new DarkActivityConfirmedWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<DarkActivityConfirmedDto>()) // Reader 리스너
.listener(new GroupByExecutionIdChunkListener()) // Chunk 리스너
.listener(darkActivityConfirmedWriteListener()) // Write 완료 batch_flag 업데이트
.build();
}
@@ -760,12 +713,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step flagHistorySyncStep() {
log.info("Step 생성: flagHistorySyncStep");
return new StepBuilder("flagHistorySyncStep", jobRepository)
.<FlagHistoryDto, FlagHistoryEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<FlagHistoryDto, FlagHistoryEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(flagHistoryReader(businessDataSource, tableMetaInfo))
.processor(new FlagHistoryProcessor())
.writer(new FlagHistoryWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<FlagHistoryDto>()) // Reader 리스너
.listener(new GroupByExecutionIdChunkListener()) // Chunk 리스너
.listener(flagHistoryWriteListener()) // Write 완료 batch_flag 업데이트
.build();
}
@@ -774,12 +725,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step groupBeneficialOwnerHistorySyncStep() {
log.info("Step 생성: groupBeneficialOwnerHistorySyncStep");
return new StepBuilder("groupBeneficialOwnerHistorySyncStep", jobRepository)
.<GroupBeneficialOwnerHistoryDto, GroupBeneficialOwnerHistoryEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<GroupBeneficialOwnerHistoryDto, GroupBeneficialOwnerHistoryEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(groupBeneficialOwnerHistoryReader(businessDataSource, tableMetaInfo))
.processor(new GroupBeneficialOwnerHistoryProcessor())
.writer(new GroupBeneficialOwnerHistoryWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<GroupBeneficialOwnerHistoryDto>()) // Reader 리스너
.listener(new GroupByExecutionIdChunkListener()) // Chunk 리스너
.listener(groupBeneficialOwnerHistoryWriteListener()) // Write 완료 batch_flag 업데이트
.build();
}
@@ -788,12 +737,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step iceClassSyncStep() {
log.info("Step 생성: iceClassSyncStep");
return new StepBuilder("iceClassSyncStep", jobRepository)
.<IceClassDto, IceClassEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<IceClassDto, IceClassEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(iceClassReader(businessDataSource, tableMetaInfo))
.processor(new IceClassProcessor())
.writer(new IceClassWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<IceClassDto>()) // Reader 리스너
.listener(new GroupByExecutionIdChunkListener()) // Chunk 리스너
.listener(iceClassWriteListener()) // Write 완료 batch_flag 업데이트
.build();
}
@@ -802,12 +749,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step nameHistorySyncStep() {
log.info("Step 생성: nameHistorySyncStep");
return new StepBuilder("nameHistorySyncStep", jobRepository)
.<NameHistoryDto, NameHistoryEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<NameHistoryDto, NameHistoryEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(nameHistoryReader(businessDataSource, tableMetaInfo))
.processor(new NameHistoryProcessor())
.writer(new NameHistoryWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<NameHistoryDto>()) // Reader 리스너
.listener(new GroupByExecutionIdChunkListener()) // Chunk 리스너
.listener(nameHistoryWriteListener()) // Write 완료 batch_flag 업데이트
.build();
}
@@ -816,12 +761,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step operatorHistorySyncStep() {
log.info("Step 생성: operatorHistorySyncStep");
return new StepBuilder("operatorHistorySyncStep", jobRepository)
.<OperatorHistoryDto, OperatorHistoryEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<OperatorHistoryDto, OperatorHistoryEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(operatorHistoryReader(businessDataSource, tableMetaInfo))
.processor(new OperatorHistoryProcessor())
.writer(new OperatorHistoryWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<OperatorHistoryDto>()) // Reader 리스너
.listener(new GroupByExecutionIdChunkListener()) // Chunk 리스너
.listener(operatorHistoryWriteListener()) // Write 완료 batch_flag 업데이트
.build();
}
@@ -830,12 +773,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step pandIHistorySyncStep() {
log.info("Step 생성: pandIHistorySyncStep");
return new StepBuilder("pandIHistorySyncStep", jobRepository)
.<PandIHistoryDto, PandIHistoryEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<PandIHistoryDto, PandIHistoryEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(pandIHistoryReader(businessDataSource, tableMetaInfo))
.processor(new PandIHistoryProcessor())
.writer(new PandIHistoryWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<PandIHistoryDto>()) // Reader 리스너
.listener(new GroupByExecutionIdChunkListener()) // Chunk 리스너
.listener(pandIHistoryWriteListener()) // Write 완료 batch_flag 업데이트
.build();
}
@@ -844,12 +785,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step safetyManagementCertificateHistSyncStep() {
log.info("Step 생성: safetyManagementCertificateHistSyncStep");
return new StepBuilder("safetyManagementCertificateHistSyncStep", jobRepository)
.<SafetyManagementCertificateHistDto, SafetyManagementCertificateHistEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<SafetyManagementCertificateHistDto, SafetyManagementCertificateHistEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(safetyManagementCertificateHistReader(businessDataSource, tableMetaInfo))
.processor(new SafetyManagementCertificateHistProcessor())
.writer(new SafetyManagementCertificateHistWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<SafetyManagementCertificateHistDto>()) // Reader listener
.listener(new GroupByExecutionIdChunkListener()) // Chunk listener
.listener(safetyManagementCertificateHistWriteListener()) // Update batch_flag after write completes
.build();
}
@@ -858,12 +797,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step shipManagerHistorySyncStep() {
log.info("Creating step: shipManagerHistorySyncStep");
return new StepBuilder("shipManagerHistorySyncStep", jobRepository)
.<ShipManagerHistoryDto, ShipManagerHistoryEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<ShipManagerHistoryDto, ShipManagerHistoryEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(shipManagerHistoryReader(businessDataSource, tableMetaInfo))
.processor(new ShipManagerHistoryProcessor())
.writer(new ShipManagerHistoryWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<ShipManagerHistoryDto>()) // Reader listener
.listener(new GroupByExecutionIdChunkListener()) // Chunk listener
.listener(shipManagerHistoryWriteListener()) // Update batch_flag after write completes
.build();
}
@@ -872,12 +809,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step sisterShipLinksSyncStep() {
log.info("Creating step: sisterShipLinksSyncStep");
return new StepBuilder("sisterShipLinksSyncStep", jobRepository)
.<SisterShipLinksDto, SisterShipLinksEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<SisterShipLinksDto, SisterShipLinksEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(sisterShipLinksReader(businessDataSource, tableMetaInfo))
.processor(new SisterShipLinksProcessor())
.writer(new SisterShipLinksWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<SisterShipLinksDto>()) // Reader listener
.listener(new GroupByExecutionIdChunkListener()) // Chunk listener
.listener(sisterShipLinksWriteListener()) // Update batch_flag after write completes
.build();
}
@@ -886,12 +821,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step specialFeatureSyncStep() {
log.info("Creating step: specialFeatureSyncStep");
return new StepBuilder("specialFeatureSyncStep", jobRepository)
.<SpecialFeatureDto, SpecialFeatureEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<SpecialFeatureDto, SpecialFeatureEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(specialFeatureReader(businessDataSource, tableMetaInfo))
.processor(new SpecialFeatureProcessor())
.writer(new SpecialFeatureWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<SpecialFeatureDto>()) // Reader listener
.listener(new GroupByExecutionIdChunkListener()) // Chunk listener
.listener(specialFeatureWriteListener()) // Update batch_flag after write completes
.build();
}
@@ -900,12 +833,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step statusHistorySyncStep() {
log.info("Creating step: statusHistorySyncStep");
return new StepBuilder("statusHistorySyncStep", jobRepository)
.<StatusHistoryDto, StatusHistoryEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<StatusHistoryDto, StatusHistoryEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(statusHistoryReader(businessDataSource, tableMetaInfo))
.processor(new StatusHistoryProcessor())
.writer(new StatusHistoryWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<StatusHistoryDto>()) // Reader listener
.listener(new GroupByExecutionIdChunkListener()) // Chunk listener
.listener(statusHistoryWriteListener()) // Update batch_flag after write completes
.build();
}
@@ -914,12 +845,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step stowageCommoditySyncStep() {
log.info("Creating step: stowageCommoditySyncStep");
return new StepBuilder("stowageCommoditySyncStep", jobRepository)
.<StowageCommodityDto, StowageCommodityEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<StowageCommodityDto, StowageCommodityEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(stowageCommodityReader(businessDataSource, tableMetaInfo))
.processor(new StowageCommodityProcessor())
.writer(new StowageCommodityWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<StowageCommodityDto>()) // Reader listener
.listener(new GroupByExecutionIdChunkListener()) // Chunk listener
.listener(stowageCommodityWriteListener()) // Update batch_flag after write completes
.build();
}
@@ -928,12 +857,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step surveyDatesSyncStep() {
log.info("Creating step: surveyDatesSyncStep");
return new StepBuilder("surveyDatesSyncStep", jobRepository)
.<SurveyDatesDto, SurveyDatesEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<SurveyDatesDto, SurveyDatesEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(surveyDatesReader(businessDataSource, tableMetaInfo))
.processor(new SurveyDatesProcessor())
.writer(new SurveyDatesWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<SurveyDatesDto>()) // Reader listener
.listener(new GroupByExecutionIdChunkListener()) // Chunk listener
.listener(surveyDatesWriteListener()) // Update batch_flag after write completes
.build();
}
@@ -942,12 +869,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step surveyDatesHistoryUniqueSyncStep() {
log.info("Creating step: surveyDatesHistoryUniqueSyncStep");
return new StepBuilder("surveyDatesHistoryUniqueSyncStep", jobRepository)
.<SurveyDatesHistoryUniqueDto, SurveyDatesHistoryUniqueEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<SurveyDatesHistoryUniqueDto, SurveyDatesHistoryUniqueEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(surveyDatesHistoryUniqueReader(businessDataSource, tableMetaInfo))
.processor(new SurveyDatesHistoryUniqueProcessor())
.writer(new SurveyDatesHistoryUniqueWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<SurveyDatesHistoryUniqueDto>()) // Reader listener
.listener(new GroupByExecutionIdChunkListener()) // Chunk listener
.listener(surveyDatesHistoryUniqueWriteListener()) // Update batch_flag after write completes
.build();
}
@@ -956,12 +881,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step technicalManagerHistorySyncStep() {
log.info("Creating step: technicalManagerHistorySyncStep");
return new StepBuilder("technicalManagerHistorySyncStep", jobRepository)
.<TechnicalManagerHistoryDto, TechnicalManagerHistoryEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<TechnicalManagerHistoryDto, TechnicalManagerHistoryEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(technicalManagerHistoryReader(businessDataSource, tableMetaInfo))
.processor(new TechnicalManagerHistoryProcessor())
.writer(new TechnicalManagerHistoryWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<TechnicalManagerHistoryDto>()) // Reader listener
.listener(new GroupByExecutionIdChunkListener()) // Chunk listener
.listener(technicalManagerHistoryWriteListener()) // Update batch_flag after write completes
.build();
}
@@ -970,12 +893,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step thrustersSyncStep() {
log.info("Creating step: thrustersSyncStep");
return new StepBuilder("thrustersSyncStep", jobRepository)
.<ThrustersDto, ThrustersEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<ThrustersDto, ThrustersEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(thrustersReader(businessDataSource, tableMetaInfo))
.processor(new ThrustersProcessor())
.writer(new ThrustersWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<ThrustersDto>()) // Reader listener
.listener(new GroupByExecutionIdChunkListener()) // Chunk listener
.listener(thrustersWriteListener()) // Update batch_flag after write completes
.build();
}
@@ -984,12 +905,10 @@ public class ShipDetailSyncJobConfig extends BaseJobConfig<ShipInfoMstDto, ShipI
public Step tbCompanyDetailSyncStep() {
log.info("Creating step: tbCompanyDetailSyncStep");
return new StepBuilder("tbCompanyDetailSyncStep", jobRepository)
.<TbCompanyDetailDto, TbCompanyDetailEntity>chunk(new GroupByExecutionIdPolicy(), transactionManager)
.<TbCompanyDetailDto, TbCompanyDetailEntity>chunk(Integer.MAX_VALUE, transactionManager)
.reader(tbCompanyDetailReader(businessDataSource, tableMetaInfo))
.processor(new TbCompanyDetailProcessor())
.writer(new TbCompanyDetailWriter(shipRepository, transactionManager, subChunkSize))
.listener(new GroupByExecutionIdReadListener<TbCompanyDetailDto>()) // Reader listener
.listener(new GroupByExecutionIdChunkListener()) // Chunk listener
.listener(tbCompanyDetailWriteListener()) // Update batch_flag after write completes
.build();
}


@@ -1,73 +1,37 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.BareboatCharterHistoryDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class BareboatCharterHistoryReader implements ItemReader<BareboatCharterHistoryDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<BareboatCharterHistoryDto> allDataBuffer = new ArrayList<>();
public class BareboatCharterHistoryReader extends BaseSyncReader<BareboatCharterHistoryDto> {
public BareboatCharterHistoryReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public BareboatCharterHistoryDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceBareboatCharterHistory;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceBareboatCharterHistory), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[BareboatCharterHistoryReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceBareboatCharterHistory);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return BareboatCharterHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.bbctrSeq(rs.getString("bbctr_seq"))
.efectStaDay(rs.getString("efect_sta_day"))
.bbctrCompanyCd(rs.getString("bbctr_company_cd"))
.bbctrCompany(rs.getString("bbctr_company"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceBareboatCharterHistory);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected BareboatCharterHistoryDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return BareboatCharterHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.bbctrSeq(rs.getString("bbctr_seq"))
.efectStaDay(rs.getString("efect_sta_day"))
.bbctrCompanyCd(rs.getString("bbctr_company_cd"))
.bbctrCompany(rs.getString("bbctr_company"))
.build();
}
}
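The refactored readers above delegate their buffering loop to `BaseSyncReader`, whose definition is not part of this diff. A minimal, self-contained sketch of the template-method pattern the removed code implies (the Spring Batch `ItemReader` and `JdbcTemplate` plumbing is replaced here with an in-memory stand-in; all names other than `read()` are hypothetical):

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

// Simplified sketch: the database-backed group fetch is replaced by an
// in-memory queue so the buffering logic can run standalone.
abstract class GroupBufferReader<T> {
    private final Deque<T> buffer = new ArrayDeque<>();

    // One read() call returns one item; when the buffer drains, the next
    // whole execution group is loaded at once. This is presumably why the
    // steps can use chunk(Integer.MAX_VALUE): each group is one chunk.
    public T read() {
        if (buffer.isEmpty()) {
            buffer.addAll(fetchNextGroup());
        }
        return buffer.isEmpty() ? null : buffer.poll();
    }

    // Subclasses supply the next group of rows (empty list = no more data),
    // standing in for the getNextTargetQuery / getTargetDataQuery SQL pair.
    protected abstract List<T> fetchNextGroup();
}

class DemoReader extends GroupBufferReader<String> {
    private final Deque<List<String>> groups =
            new ArrayDeque<>(List.of(List.of("a1", "a2"), List.of("b1")));

    @Override
    protected List<String> fetchNextGroup() {
        List<String> next = groups.poll();
        return next == null ? List.of() : next;
    }
}

public class Main {
    public static void main(String[] args) {
        GroupBufferReader<String> reader = new DemoReader();
        List<String> drained = new ArrayList<>();
        for (String item = reader.read(); item != null; item = reader.read()) {
            drained.add(item);
        }
        System.out.println(drained); // [a1, a2, b1]
    }
}
```

Under this reading, the `GroupByExecutionIdPolicy` and the group-by listeners become unnecessary because the reader itself enforces group boundaries; each concrete reader only contributes its source table name and row mapping.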


@@ -1,73 +1,37 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.CallsignAndMmsiHistoryDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class CallsignAndMmsiHistoryReader implements ItemReader<CallsignAndMmsiHistoryDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<CallsignAndMmsiHistoryDto> allDataBuffer = new ArrayList<>();
public class CallsignAndMmsiHistoryReader extends BaseSyncReader<CallsignAndMmsiHistoryDto> {
public CallsignAndMmsiHistoryReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public CallsignAndMmsiHistoryDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceCallsignAndMmsiHistory;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceCallsignAndMmsiHistory), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[CallsignAndMmsiHistoryReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceCallsignAndMmsiHistory);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return CallsignAndMmsiHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipIdntfSeq(rs.getString("ship_idntf_seq"))
.efectStaDay(rs.getString("efect_sta_day"))
.clsgnNo(rs.getString("clsgn_no"))
.mmsiNo(rs.getString("mmsi_no"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceCallsignAndMmsiHistory);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected CallsignAndMmsiHistoryDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return CallsignAndMmsiHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipIdntfSeq(rs.getString("ship_idntf_seq"))
.efectStaDay(rs.getString("efect_sta_day"))
.clsgnNo(rs.getString("clsgn_no"))
.mmsiNo(rs.getString("mmsi_no"))
.build();
}
}


@@ -1,76 +1,40 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.ClassHistoryDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class ClassHistoryReader implements ItemReader<ClassHistoryDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<ClassHistoryDto> allDataBuffer = new ArrayList<>();
public class ClassHistoryReader extends BaseSyncReader<ClassHistoryDto> {
public ClassHistoryReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public ClassHistoryDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceClassHistory;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceClassHistory), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[ClassHistoryReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceClassHistory);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return ClassHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.clficHstrySeq(rs.getString("clfic_hstry_seq"))
.efectStaDay(rs.getString("efect_sta_day"))
.clficCd(rs.getString("clfic_cd"))
.clficId(rs.getString("clfic_id"))
.clficAstnNm(rs.getString("clfic_asctn_nm"))
.clficHasYn(rs.getString("clfic_has_yn"))
.nowYn(rs.getString("now_yn"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceClassHistory);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected ClassHistoryDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return ClassHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.clficHstrySeq(rs.getString("clfic_hstry_seq"))
.efectStaDay(rs.getString("efect_sta_day"))
.clficCd(rs.getString("clfic_cd"))
.clficId(rs.getString("clfic_id"))
.clficAstnNm(rs.getString("clfic_asctn_nm"))
.clficHasYn(rs.getString("clfic_has_yn"))
.nowYn(rs.getString("now_yn"))
.build();
}
}


@@ -1,89 +1,53 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.CompanyVesselRelationshipsDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class CompanyVesselRelationshipsReader implements ItemReader<CompanyVesselRelationshipsDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<CompanyVesselRelationshipsDto> allDataBuffer = new ArrayList<>();
public class CompanyVesselRelationshipsReader extends BaseSyncReader<CompanyVesselRelationshipsDto> {
public CompanyVesselRelationshipsReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public CompanyVesselRelationshipsDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceCompanyVesselRelationships;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceCompanyVesselRelationships), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[CompanyVesselRelationshipsReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceCompanyVesselRelationships);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return CompanyVesselRelationshipsDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.doccHasCompanyCd(rs.getString("docc_has_company_cd"))
.doccHasCompany(rs.getString("docc_has_company"))
.groupActlOwnr(rs.getString("group_actl_ownr"))
.groupActlOwnrCd(rs.getString("group_actl_ownr_cd"))
.shipOperator(rs.getString("ship_operator"))
.shipOperatorCd(rs.getString("ship_operator_cd"))
.rgOwnr(rs.getString("rg_ownr"))
.rgOwnrCd(rs.getString("rg_ownr_cd"))
.shipMngCompany(rs.getString("ship_mng_company"))
.shipMngCompanyCd(rs.getString("ship_mng_company_cd"))
.techMngCompany(rs.getString("tech_mng_company"))
.techMngCompanyCd(rs.getString("tech_mng_company_cd"))
.doccGroup(rs.getString("docc_group"))
.doccGroupCd(rs.getString("docc_group_cd"))
.shipOperatorGroup(rs.getString("ship_operator_group"))
.shipOperatorGroupCd(rs.getString("ship_operator_group_cd"))
.shipMngCompanyGroup(rs.getString("ship_mng_company_group"))
.shipMngCompanyGroupCd(rs.getString("ship_mng_company_group_cd"))
.techMngCompanyGroup(rs.getString("tech_mng_company_group"))
.techMngCompanyGroupCd(rs.getString("tech_mng_company_group_cd"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceCompanyVesselRelationships);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected CompanyVesselRelationshipsDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return CompanyVesselRelationshipsDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.doccHasCompanyCd(rs.getString("docc_has_company_cd"))
.doccHasCompany(rs.getString("docc_has_company"))
.groupActlOwnr(rs.getString("group_actl_ownr"))
.groupActlOwnrCd(rs.getString("group_actl_ownr_cd"))
.shipOperator(rs.getString("ship_operator"))
.shipOperatorCd(rs.getString("ship_operator_cd"))
.rgOwnr(rs.getString("rg_ownr"))
.rgOwnrCd(rs.getString("rg_ownr_cd"))
.shipMngCompany(rs.getString("ship_mng_company"))
.shipMngCompanyCd(rs.getString("ship_mng_company_cd"))
.techMngCompany(rs.getString("tech_mng_company"))
.techMngCompanyCd(rs.getString("tech_mng_company_cd"))
.doccGroup(rs.getString("docc_group"))
.doccGroupCd(rs.getString("docc_group_cd"))
.shipOperatorGroup(rs.getString("ship_operator_group"))
.shipOperatorGroupCd(rs.getString("ship_operator_group_cd"))
.shipMngCompanyGroup(rs.getString("ship_mng_company_group"))
.shipMngCompanyGroupCd(rs.getString("ship_mng_company_group_cd"))
.techMngCompanyGroup(rs.getString("tech_mng_company_group"))
.techMngCompanyGroupCd(rs.getString("tech_mng_company_group_cd"))
.build();
}
}


@@ -1,80 +1,44 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.CrewListDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class CrewListReader implements ItemReader<CrewListDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<CrewListDto> allDataBuffer = new ArrayList<>();
public class CrewListReader extends BaseSyncReader<CrewListDto> {
public CrewListReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public CrewListDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceCrewList;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceCrewList), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[CrewListReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceCrewList);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return CrewListDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.crewId(rs.getString("crew_id"))
.shipNm(rs.getString("ship_nm"))
.ntnlty(rs.getString("ntnlty"))
.crewRstrYmd(rs.getString("crew_rstr_ymd"))
.oaCrewCnt(rs.getBigDecimal("oa_crew_cnt"))
.genCrewCnt(rs.getBigDecimal("gen_crew_cnt"))
.offcrCnt(rs.getBigDecimal("offcr_cnt"))
.apprOffcrCnt(rs.getBigDecimal("appr_offcr_cnt"))
.trneCnt(rs.getBigDecimal("trne_cnt"))
.embrkMntncCrewCnt(rs.getBigDecimal("embrk_mntnc_crew_cnt"))
.unrprtCnt(rs.getBigDecimal("unrprt_cnt"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceCrewList);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected CrewListDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return CrewListDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.crewId(rs.getString("crew_id"))
.shipNm(rs.getString("ship_nm"))
.ntnlty(rs.getString("ntnlty"))
.crewRstrYmd(rs.getString("crew_rstr_ymd"))
.oaCrewCnt(rs.getBigDecimal("oa_crew_cnt"))
.genCrewCnt(rs.getBigDecimal("gen_crew_cnt"))
.offcrCnt(rs.getBigDecimal("offcr_cnt"))
.apprOffcrCnt(rs.getBigDecimal("appr_offcr_cnt"))
.trneCnt(rs.getBigDecimal("trne_cnt"))
.embrkMntncCrewCnt(rs.getBigDecimal("embrk_mntnc_crew_cnt"))
.unrprtCnt(rs.getBigDecimal("unrprt_cnt"))
.build();
}
}


@@ -1,94 +1,58 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.DarkActivityConfirmedDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class DarkActivityConfirmedReader implements ItemReader<DarkActivityConfirmedDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<DarkActivityConfirmedDto> allDataBuffer = new ArrayList<>();
public class DarkActivityConfirmedReader extends BaseSyncReader<DarkActivityConfirmedDto> {
public DarkActivityConfirmedReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public DarkActivityConfirmedDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceDarkActivityConfirmed;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceDarkActivityConfirmed), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[DarkActivityConfirmedReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceDarkActivityConfirmed);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return DarkActivityConfirmedDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.mmsiNo(rs.getString("mmsi_no"))
.darkHr(rs.getObject("dark_hr", Long.class))
.darkActvStatus(rs.getObject("dark_actv_status", Long.class))
.shipNm(rs.getString("ship_nm"))
.darkActv(rs.getString("dark_actv"))
.zoneId(rs.getObject("zone_id", Long.class))
.zoneNm(rs.getString("zone_nm"))
.zoneCountry(rs.getString("zone_country"))
.darkTmUtc(rs.getTimestamp("dark_tm_utc") != null ? rs.getTimestamp("dark_tm_utc").toLocalDateTime() : null)
.darkLat(rs.getObject("dark_lat", Double.class))
.darkLon(rs.getObject("dark_lon", Double.class))
.darkSpd(rs.getObject("dark_spd", Double.class))
.darkHeading(rs.getObject("dark_heading", Double.class))
.darkDraft(rs.getObject("dark_draft", Double.class))
.nxtCptrTmUtc(rs.getTimestamp("nxt_cptr_tm_utc") != null ? rs.getTimestamp("nxt_cptr_tm_utc").toLocalDateTime() : null)
.nxtCptrSpd(rs.getObject("nxt_cptr_spd", Double.class))
.nxtCptrDraft(rs.getObject("nxt_cptr_draft", Double.class))
.nxtCptrHeading(rs.getObject("nxt_cptr_heading", Double.class))
.darkRptDestAis(rs.getString("dark_rpt_dest_ais"))
.lastPrtcllPort(rs.getString("last_prtcll_port"))
.lastPoccntryCd(rs.getString("last_poccntry_cd"))
.lastPoccntry(rs.getString("last_poccntry"))
.nxtCptrLat(rs.getObject("nxt_cptr_lat", Double.class))
.nxtCptrLon(rs.getObject("nxt_cptr_lon", Double.class))
.nxtCptrRptDestAis(rs.getString("nxt_cptr_rpt_dest_ais"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceDarkActivityConfirmed);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected DarkActivityConfirmedDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return DarkActivityConfirmedDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.mmsiNo(rs.getString("mmsi_no"))
.darkHr(rs.getObject("dark_hr", Long.class))
.darkActvStatus(rs.getObject("dark_actv_status", Long.class))
.shipNm(rs.getString("ship_nm"))
.darkActv(rs.getString("dark_actv"))
.zoneId(rs.getObject("zone_id", Long.class))
.zoneNm(rs.getString("zone_nm"))
.zoneCountry(rs.getString("zone_country"))
.darkTmUtc(rs.getTimestamp("dark_tm_utc") != null ? rs.getTimestamp("dark_tm_utc").toLocalDateTime() : null)
.darkLat(rs.getObject("dark_lat", Double.class))
.darkLon(rs.getObject("dark_lon", Double.class))
.darkSpd(rs.getObject("dark_spd", Double.class))
.darkHeading(rs.getObject("dark_heading", Double.class))
.darkDraft(rs.getObject("dark_draft", Double.class))
.nxtCptrTmUtc(rs.getTimestamp("nxt_cptr_tm_utc") != null ? rs.getTimestamp("nxt_cptr_tm_utc").toLocalDateTime() : null)
.nxtCptrSpd(rs.getObject("nxt_cptr_spd", Double.class))
.nxtCptrDraft(rs.getObject("nxt_cptr_draft", Double.class))
.nxtCptrHeading(rs.getObject("nxt_cptr_heading", Double.class))
.darkRptDestAis(rs.getString("dark_rpt_dest_ais"))
.lastPrtcllPort(rs.getString("last_prtcll_port"))
.lastPoccntryCd(rs.getString("last_poccntry_cd"))
.lastPoccntry(rs.getString("last_poccntry"))
.nxtCptrLat(rs.getObject("nxt_cptr_lat", Double.class))
.nxtCptrLon(rs.getObject("nxt_cptr_lon", Double.class))
.nxtCptrRptDestAis(rs.getString("nxt_cptr_rpt_dest_ais"))
.build();
}
}
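The mapper above repeats the `rs.getTimestamp(...) != null ? ...toLocalDateTime() : null` guard for every nullable timestamp column, and uses `rs.getObject(col, Double.class)` instead of `getDouble` because the primitive getter silently returns 0.0 for SQL NULL. A small helper (hypothetical, not part of this diff) would centralize the timestamp guard:

```java
import java.sql.Timestamp;
import java.time.LocalDateTime;

// Hypothetical helper (not in this diff): SQL NULL comes back as a null
// Timestamp, so calling toLocalDateTime() directly would throw an NPE.
final class SqlNulls {
    private SqlNulls() {}

    static LocalDateTime toLocalDateTime(Timestamp ts) {
        return ts == null ? null : ts.toLocalDateTime();
    }
}
```

With this, each mapper line shrinks to `.darkTmUtc(SqlNulls.toLocalDateTime(rs.getTimestamp("dark_tm_utc")))`.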


@@ -1,73 +1,37 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.FlagHistoryDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class FlagHistoryReader implements ItemReader<FlagHistoryDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<FlagHistoryDto> allDataBuffer = new ArrayList<>();
public class FlagHistoryReader extends BaseSyncReader<FlagHistoryDto> {
public FlagHistoryReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public FlagHistoryDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceFlagHistory;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceFlagHistory), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[FlagHistoryReader] Next target ID found: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceFlagHistory);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return FlagHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipCountryHstrySeq(rs.getString("ship_country_hstry_seq"))
.efectStaDay(rs.getString("efect_sta_day"))
.countryCd(rs.getString("country_cd"))
.country(rs.getString("country"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceFlagHistory);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected FlagHistoryDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return FlagHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipCountryHstrySeq(rs.getString("ship_country_hstry_seq"))
.efectStaDay(rs.getString("efect_sta_day"))
.countryCd(rs.getString("country_cd"))
.country(rs.getString("country"))
.build();
}
}
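Every reader in this diff collapses onto the same template: the new BaseSyncReader owns the buffer-and-fetch loop, and each subclass only supplies `getSourceTable()` and `mapRow()`. A minimal, framework-free sketch of that template-method pattern (class and method names here are illustrative; the real BaseSyncReader wraps Spring's JdbcTemplate and CommonSql, which are not shown in this excerpt):

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

// Illustrative template-method base: the base class owns the buffering,
// subclasses supply the next group of rows. Mirrors the read()/fetchNextGroup()
// shape of the readers in this diff, minus the database access.
abstract class BufferedGroupReader<T> {
    private final Deque<T> buffer = new ArrayDeque<>();

    // Returns the next item, refilling the buffer one group at a time.
    // null signals end of input (the Spring Batch ItemReader convention).
    public T read() {
        if (buffer.isEmpty()) {
            buffer.addAll(fetchNextGroup()); // empty list when exhausted
        }
        return buffer.poll(); // null when still empty
    }

    // Subclass hook: load all rows belonging to the next unprocessed group.
    protected abstract List<T> fetchNextGroup();
}

// Demo subclass backed by in-memory groups instead of a JdbcTemplate query.
class DemoReader extends BufferedGroupReader<String> {
    private final Deque<List<String>> groups = new ArrayDeque<>(
            List.of(List.of("a", "b"), List.of("c")));

    @Override
    protected List<String> fetchNextGroup() {
        List<String> g = groups.poll();
        return g == null ? List.of() : g;
    }
}
```

The payoff is visible in the hunk sizes: each ~73-line reader shrinks to ~37 lines, keeping only the table name and the row mapping.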


@@ -1,74 +1,38 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.GroupBeneficialOwnerHistoryDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class GroupBeneficialOwnerHistoryReader implements ItemReader<GroupBeneficialOwnerHistoryDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<GroupBeneficialOwnerHistoryDto> allDataBuffer = new ArrayList<>();
public class GroupBeneficialOwnerHistoryReader extends BaseSyncReader<GroupBeneficialOwnerHistoryDto> {
public GroupBeneficialOwnerHistoryReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public GroupBeneficialOwnerHistoryDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceGroupBeneficialOwnerHistory;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceGroupBeneficialOwnerHistory), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[GroupBeneficialOwnerHistoryReader] Next target ID found: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceGroupBeneficialOwnerHistory);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return GroupBeneficialOwnerHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipGroupRevnOwnrHstrySeq(rs.getString("ship_group_revn_ownr_hstry_seq"))
.efectStaDay(rs.getString("efect_sta_day"))
.groupActlOwnrCd(rs.getString("group_actl_ownr_cd"))
.groupActlOwnr(rs.getString("group_actl_ownr"))
.companyStatus(rs.getString("company_status"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceGroupBeneficialOwnerHistory);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected GroupBeneficialOwnerHistoryDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return GroupBeneficialOwnerHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipGroupRevnOwnrHstrySeq(rs.getString("ship_group_revn_ownr_hstry_seq"))
.efectStaDay(rs.getString("efect_sta_day"))
.groupActlOwnrCd(rs.getString("group_actl_ownr_cd"))
.groupActlOwnr(rs.getString("group_actl_ownr"))
.companyStatus(rs.getString("company_status"))
.build();
}
}


@@ -1,71 +1,35 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.IceClassDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class IceClassReader implements ItemReader<IceClassDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<IceClassDto> allDataBuffer = new ArrayList<>();
public class IceClassReader extends BaseSyncReader<IceClassDto> {
public IceClassReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public IceClassDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceIceClass;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceIceClass), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[IceClassReader] Next target ID found: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceIceClass);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return IceClassDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.iceGrdCd(rs.getString("ice_grd_cd"))
.iceGrd(rs.getString("ice_grd"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceIceClass);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected IceClassDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return IceClassDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.iceGrdCd(rs.getString("ice_grd_cd"))
.iceGrd(rs.getString("ice_grd"))
.build();
}
}


@@ -1,72 +1,36 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.NameHistoryDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class NameHistoryReader implements ItemReader<NameHistoryDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<NameHistoryDto> allDataBuffer = new ArrayList<>();
public class NameHistoryReader extends BaseSyncReader<NameHistoryDto> {
public NameHistoryReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public NameHistoryDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceNameHistory;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceNameHistory), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[NameHistoryReader] Next target ID found: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceNameHistory);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return NameHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipNmChgHstrySeq(rs.getString("ship_nm_chg_hstry_seq"))
.efectStaDay(rs.getString("efect_sta_day"))
.shipNm(rs.getString("ship_nm"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceNameHistory);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected NameHistoryDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return NameHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipNmChgHstrySeq(rs.getString("ship_nm_chg_hstry_seq"))
.efectStaDay(rs.getString("efect_sta_day"))
.shipNm(rs.getString("ship_nm"))
.build();
}
}


@@ -1,74 +1,38 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.OperatorHistoryDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class OperatorHistoryReader implements ItemReader<OperatorHistoryDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<OperatorHistoryDto> allDataBuffer = new ArrayList<>();
public class OperatorHistoryReader extends BaseSyncReader<OperatorHistoryDto> {
public OperatorHistoryReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public OperatorHistoryDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceOperatorHistory;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceOperatorHistory), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[OperatorHistoryReader] Next target ID found: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceOperatorHistory);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return OperatorHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipOperatorHstrySeq(rs.getString("ship_operator_hstry_seq"))
.efectStaDay(rs.getString("efect_sta_day"))
.shipOperatorCd(rs.getString("ship_operator_cd"))
.shipOperator(rs.getString("ship_operator"))
.companyStatus(rs.getString("company_status"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceOperatorHistory);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected OperatorHistoryDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return OperatorHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipOperatorHstrySeq(rs.getString("ship_operator_hstry_seq"))
.efectStaDay(rs.getString("efect_sta_day"))
.shipOperatorCd(rs.getString("ship_operator_cd"))
.shipOperator(rs.getString("ship_operator"))
.companyStatus(rs.getString("company_status"))
.build();
}
}


@@ -1,74 +1,38 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.OwnerHistoryDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class OwnerHistoryReader implements ItemReader<OwnerHistoryDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<OwnerHistoryDto> allDataBuffer = new ArrayList<>();
public class OwnerHistoryReader extends BaseSyncReader<OwnerHistoryDto> {
public OwnerHistoryReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public OwnerHistoryDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceOwnerHistory;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceOwnerHistory), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[OwnerHistoryReader] Next target ID found: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceOwnerHistory);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return OwnerHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipOwnrHstrySeq(rs.getString("ship_ownr_hstry_seq"))
.efectStaDay(rs.getString("efect_sta_day"))
.ownrCd(rs.getString("ownr_cd"))
.ownr(rs.getString("ownr"))
.companyStatus(rs.getString("company_status"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceOwnerHistory);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected OwnerHistoryDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return OwnerHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipOwnrHstrySeq(rs.getString("ship_ownr_hstry_seq"))
.efectStaDay(rs.getString("efect_sta_day"))
.ownrCd(rs.getString("ownr_cd"))
.ownr(rs.getString("ownr"))
.companyStatus(rs.getString("company_status"))
.build();
}
}


@@ -1,74 +1,38 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.PandIHistoryDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class PandIHistoryReader implements ItemReader<PandIHistoryDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<PandIHistoryDto> allDataBuffer = new ArrayList<>();
public class PandIHistoryReader extends BaseSyncReader<PandIHistoryDto> {
public PandIHistoryReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public PandIHistoryDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourcePandiHistory;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourcePandiHistory), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[PandIHistoryReader] Next target ID found: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourcePandiHistory);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return PandIHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipPrtcRpnHstrySeq(rs.getString("ship_prtc_rpn_hstry_seq"))
.efectStaDay(rs.getString("efect_sta_day"))
.pniClubCd(rs.getString("pni_club_cd"))
.pniClubNm(rs.getString("pni_club_nm"))
.src(rs.getString("src"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourcePandiHistory);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected PandIHistoryDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return PandIHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipPrtcRpnHstrySeq(rs.getString("ship_prtc_rpn_hstry_seq"))
.efectStaDay(rs.getString("efect_sta_day"))
.pniClubCd(rs.getString("pni_club_cd"))
.pniClubNm(rs.getString("pni_club_nm"))
.src(rs.getString("src"))
.build();
}
}


@@ -1,82 +1,46 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.SafetyManagementCertificateHistDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class SafetyManagementCertificateHistReader implements ItemReader<SafetyManagementCertificateHistDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<SafetyManagementCertificateHistDto> allDataBuffer = new ArrayList<>();
public class SafetyManagementCertificateHistReader extends BaseSyncReader<SafetyManagementCertificateHistDto> {
public SafetyManagementCertificateHistReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public SafetyManagementCertificateHistDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceSafetyManagementCertificateHist;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceSafetyManagementCertificateHist), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[SafetyManagementCertificateHistReader] Next target ID found: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceSafetyManagementCertificateHist);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return SafetyManagementCertificateHistDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipSftyMngEvdcSeq(rs.getString("ship_sfty_mng_evdc_seq"))
.smgrcSrngEngines(rs.getString("smgrc_srng_engines"))
.smgrcSysCatConvArbt(rs.getString("smgrc_sys_cat_conv_arbt"))
.smgrcExpryDay(rs.getString("smgrc_expry_day"))
.smgrcIssueDay(rs.getString("smgrc_issue_day"))
.smgrcDoccCompany(rs.getString("smgrc_docc_company"))
.smgrcNtnlty(rs.getString("smgrc_ntnlty"))
.smgrcIssueEngines(rs.getString("smgrc_issue_engines"))
.smgrcEtcDesc(rs.getString("smgrc_etc_desc"))
.smgrcShipNm(rs.getString("smgrc_ship_nm"))
.smgrcShipType(rs.getString("smgrc_ship_type"))
.smgrcSrc(rs.getString("smgrc_src"))
.smgrcCompanyCd(rs.getString("smgrc_company_cd"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceSafetyManagementCertificateHist);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected SafetyManagementCertificateHistDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return SafetyManagementCertificateHistDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipSftyMngEvdcSeq(rs.getString("ship_sfty_mng_evdc_seq"))
.smgrcSrngEngines(rs.getString("smgrc_srng_engines"))
.smgrcSysCatConvArbt(rs.getString("smgrc_sys_cat_conv_arbt"))
.smgrcExpryDay(rs.getString("smgrc_expry_day"))
.smgrcIssueDay(rs.getString("smgrc_issue_day"))
.smgrcDoccCompany(rs.getString("smgrc_docc_company"))
.smgrcNtnlty(rs.getString("smgrc_ntnlty"))
.smgrcIssueEngines(rs.getString("smgrc_issue_engines"))
.smgrcEtcDesc(rs.getString("smgrc_etc_desc"))
.smgrcShipNm(rs.getString("smgrc_ship_nm"))
.smgrcShipType(rs.getString("smgrc_ship_type"))
.smgrcSrc(rs.getString("smgrc_src"))
.smgrcCompanyCd(rs.getString("smgrc_company_cd"))
.build();
}
}


@@ -1,84 +1,44 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.ShipAddInfoDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class ShipAddInfoReader implements ItemReader<ShipAddInfoDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<ShipAddInfoDto> allDataBuffer = new ArrayList<>();
public class ShipAddInfoReader extends BaseSyncReader<ShipAddInfoDto> {
public ShipAddInfoReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public ShipAddInfoDto read() throws Exception {
// 1. Only fetch from the DB (the rows for the single "next target ID") when the buffer is empty.
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null; // no data left, stop reading
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceAdditionalShipsData;
}
private void fetchNextGroup() {
// 1. Find the smallest ID still marked 'N'
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceAdditionalShipsData), Long.class);
} catch (Exception e) {
return; // no target
}
if (nextTargetId != null) {
log.info("[ShipAddInfoReader] Next target ID found: {}", nextTargetId);
// 2. Load only that ID's rows into the buffer
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceAdditionalShipsData);
final Long targetId = nextTargetId; // effectively final copy for use inside the lambda
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return ShipAddInfoDto.builder()
.jobExecutionId(targetId) // set job_execution_id
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipEml(rs.getString("ship_eml"))
.maxDpwt(rs.getString("max_dpwt"))
.maxDrillDepth(rs.getString("max_drill_depth"))
.drillBrg(rs.getString("drill_brg"))
.oceanProdFacility(rs.getString("ocean_prod_facility"))
.deckHeatExch(rs.getString("deck_heat_exch"))
.dehtexMatral(rs.getString("dehtex_matral"))
.portblTwinDeck(rs.getString("portbl_twin_deck"))
.fixedTwinDeck(rs.getString("fixed_twin_deck"))
.shipSatlitCommId(rs.getString("ship_satlit_comm_id"))
.shipSatlitCmrspCd(rs.getString("ship_satlit_cmrsp_cd"))
.build();
}, nextTargetId);
// 3. Mark that ID as 'P'
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceAdditionalShipsData);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected ShipAddInfoDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return ShipAddInfoDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipEml(rs.getString("ship_eml"))
.maxDpwt(rs.getString("max_dpwt"))
.maxDrillDepth(rs.getString("max_drill_depth"))
.drillBrg(rs.getString("drill_brg"))
.oceanProdFacility(rs.getString("ocean_prod_facility"))
.deckHeatExch(rs.getString("deck_heat_exch"))
.dehtexMatral(rs.getString("dehtex_matral"))
.portblTwinDeck(rs.getString("portbl_twin_deck"))
.fixedTwinDeck(rs.getString("fixed_twin_deck"))
.shipSatlitCommId(rs.getString("ship_satlit_comm_id"))
.shipSatlitCmrspCd(rs.getString("ship_satlit_cmrsp_cd"))
.build();
}
}
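The comments in this reader spell out the handshake the CommonSql helpers implement: pick the smallest job_execution_id still marked 'N', load its rows, then flip the status to 'P' so the batch does not re-read it. An in-memory simulation of that status flow (the 'N'/'P' values come from the comments; the actual SQL lives in CommonSql and is not shown in this excerpt):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// In-memory stand-in for the batch-status handshake the readers perform.
// Column and status semantics are taken from the comments in this diff.
class StatusQueue {
    private final Map<Long, String> status = new LinkedHashMap<>();

    void register(long id) { status.put(id, "N"); } // new, unprocessed

    // Mirrors CommonSql.getNextTargetQuery: smallest id with status 'N'.
    Long nextTarget() {
        return status.entrySet().stream()
                .filter(e -> "N".equals(e.getValue()))
                .map(Map.Entry::getKey)
                .min(Long::compare)
                .orElse(null);
    }

    // Mirrors CommonSql.getProcessBatchQuery: mark the id as in-process.
    void markProcessing(long id) { status.put(id, "P"); }
}
```

Because `nextTarget()` skips anything already 'P', a crashed or concurrent run cannot pick up the same execution id twice.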


@@ -1,156 +1,115 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.ShipInfoMstDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class ShipDataReader implements ItemReader<ShipInfoMstDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<ShipInfoMstDto> allDataBuffer = new ArrayList<>();
public class ShipDataReader extends BaseSyncReader<ShipInfoMstDto> {
public ShipDataReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public ShipInfoMstDto read() throws Exception {
// 1. Only fetch from the DB (the rows for the single "next target ID") when the buffer is empty.
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null; // no data left, stop reading
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceShipDetailData;
}
private void fetchNextGroup() {
// 1. Find the smallest ID still marked 'N'
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(CommonSql.getNextTargetQuery(tableMetaInfo.sourceShipDetailData), Long.class);
} catch (Exception e) {
return; // no target
}
if (nextTargetId != null) {
log.info("[ShipDataReader] Next target ID found: {}", nextTargetId);
// 2. Load only that ID's rows into the buffer
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceShipDetailData);
final Long targetId = nextTargetId; // effectively final copy for use inside the lambda
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return ShipInfoMstDto.builder()
.jobExecutionId(targetId) // set job_execution_id
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.mmsiNo(rs.getString("mmsi_no"))
.shipNm(rs.getString("ship_nm"))
.clsgnNo(rs.getString("clsgn_no"))
.frmlaRegNo(rs.getString("frmla_reg_no"))
.fshrPrmtNo(rs.getString("fshr_prmt_no"))
.shipNtnlty(rs.getString("ship_ntnlty"))
.ntnltyCd(rs.getString("ntnlty_cd"))
.loadPort(rs.getString("load_port"))
.clfic(rs.getString("clfic"))
.clficDesc(rs.getString("clfic_desc"))
.shipStatus(rs.getString("ship_status"))
.shipTypeGroup(rs.getString("ship_type_group"))
.shipTypeLvTwo(rs.getString("ship_type_lv_two"))
.shipTypeLvThr(rs.getString("ship_type_lv_thr"))
.shipTypeLvFour(rs.getString("ship_type_lv_four"))
.shipTypeLvFive(rs.getString("ship_type_lv_five"))
.shipTypeLvFiveDtldType(rs.getString("ship_type_lv_five_dtld_type"))
.shipTypeLvFiveHullType(rs.getString("ship_type_lv_five_hull_type"))
.shipTypeLvFiveLwrnkGroup(rs.getString("ship_type_lv_five_lwrnk_group"))
.buildYy(rs.getString("build_yy"))
.buildYmd(rs.getString("build_ymd"))
.shpyrd(rs.getString("shpyrd"))
.shpyrdOffclNm(rs.getString("shpyrd_offcl_nm"))
.shpyrdBuildNo(rs.getString("shpyrd_build_no"))
.buildDesc(rs.getString("build_desc"))
.modfHstryDesc(rs.getString("modf_hstry_desc"))
.whlnthLoa(rs.getString("whlnth_loa"))
.regLength(rs.getString("reg_length"))
.lbp(rs.getString("lbp"))
.formnBreadth(rs.getString("formn_breadth"))
.maxBreadth(rs.getString("max_breadth"))
.depth(rs.getString("depth"))
.draft(rs.getString("draft"))
.keelMastHg(rs.getString("keel_mast_hg"))
.bulbBow(rs.getString("bulb_bow"))
.gt(rs.getString("gt"))
.ntTon(rs.getString("nt_ton"))
.dwt(rs.getString("dwt"))
.displacement(rs.getString("displacement"))
.lightDisplacementTon(rs.getString("light_displacement_ton"))
.cgt(rs.getString("cgt"))
.fldngOneCmPerTonTpci(rs.getString("fldng_one_cm_per_ton_tpci"))
.tonEfectDay(rs.getString("ton_efect_day"))
.calcfrmDwt(rs.getString("calcfrm_dwt"))
.teuCnt(rs.getString("teu_cnt"))
.teuCapacity(rs.getString("teu_capacity"))
.grainCapacityM3(rs.getString("grain_capacity_m3"))
.baleCapacity(rs.getString("bale_capacity"))
.liquidCapacity(rs.getString("liquid_capacity"))
.gasM3(rs.getString("gas_m3"))
.insulatedM3(rs.getString("insulated_m3"))
.passengerCapacity(rs.getString("passenger_capacity"))
.bollardPull(rs.getString("bollard_pull"))
.svcSpd(rs.getString("svc_spd"))
.mainEngineType(rs.getString("main_engine_type"))
.fuelCnsmpSpdOne(rs.getString("fuel_cnsmp_spd_one"))
.fuelCnsmpamtValOne(rs.getString("fuel_cnsmpamt_val_one"))
.fuelCnsmpSpdTwo(rs.getString("fuel_cnsmp_spd_two"))
.fuelCnsmpamtValTwo(rs.getString("fuel_cnsmpamt_val_two"))
.totalFuelCapacityM3(rs.getString("total_fuel_capacity_m3"))
.blrMftr(rs.getString("blr_mftr"))
.proplrMftr(rs.getString("proplr_mftr"))
.cargoCapacityM3Desc(rs.getString("cargo_capacity_m3_desc"))
.eqpmntDesc(rs.getString("eqpmnt_desc"))
.hdn(rs.getString("hdn"))
.hatcheDesc(rs.getString("hatche_desc"))
.laneDoorRampDesc(rs.getString("lane_door_ramp_desc"))
.spcTankDesc(rs.getString("spc_tank_desc"))
.tankDesc(rs.getString("tank_desc"))
.prmovrDesc(rs.getString("prmovr_desc"))
.prmovrOvrvwDesc(rs.getString("prmovr_ovrvw_desc"))
.auxDesc(rs.getString("aux_desc"))
.asstGnrtrDesc(rs.getString("asst_gnrtr_desc"))
.fuelDesc(rs.getString("fuel_desc"))
.docCompanyCd(rs.getString("doc_company_cd"))
.groupActlOwnrCompanyCd(rs.getString("group_actl_ownr_company_cd"))
.operator(rs.getString("operator"))
.operatorCompanyCd(rs.getString("operator_company_cd"))
.shipMngrCompanyCd(rs.getString("ship_mngr_company_cd"))
.techMngrCd(rs.getString("tech_mngr_cd"))
.regShponrCd(rs.getString("reg_shponr_cd"))
.lastMdfcnDt(rs.getString("last_mdfcn_dt"))
.build();
}, nextTargetId);
// 3. Mark this ID as 'P' (in progress)
updateBatchProcessing(nextTargetId);
}
@Override
protected ShipInfoMstDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return ShipInfoMstDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.mmsiNo(rs.getString("mmsi_no"))
.shipNm(rs.getString("ship_nm"))
.clsgnNo(rs.getString("clsgn_no"))
.frmlaRegNo(rs.getString("frmla_reg_no"))
.fshrPrmtNo(rs.getString("fshr_prmt_no"))
.shipNtnlty(rs.getString("ship_ntnlty"))
.ntnltyCd(rs.getString("ntnlty_cd"))
.loadPort(rs.getString("load_port"))
.clfic(rs.getString("clfic"))
.clficDesc(rs.getString("clfic_desc"))
.shipStatus(rs.getString("ship_status"))
.shipTypeGroup(rs.getString("ship_type_group"))
.shipTypeLvTwo(rs.getString("ship_type_lv_two"))
.shipTypeLvThr(rs.getString("ship_type_lv_thr"))
.shipTypeLvFour(rs.getString("ship_type_lv_four"))
.shipTypeLvFive(rs.getString("ship_type_lv_five"))
.shipTypeLvFiveDtldType(rs.getString("ship_type_lv_five_dtld_type"))
.shipTypeLvFiveHullType(rs.getString("ship_type_lv_five_hull_type"))
.shipTypeLvFiveLwrnkGroup(rs.getString("ship_type_lv_five_lwrnk_group"))
.buildYy(rs.getString("build_yy"))
.buildYmd(rs.getString("build_ymd"))
.shpyrd(rs.getString("shpyrd"))
.shpyrdOffclNm(rs.getString("shpyrd_offcl_nm"))
.shpyrdBuildNo(rs.getString("shpyrd_build_no"))
.buildDesc(rs.getString("build_desc"))
.modfHstryDesc(rs.getString("modf_hstry_desc"))
.whlnthLoa(rs.getString("whlnth_loa"))
.regLength(rs.getString("reg_length"))
.lbp(rs.getString("lbp"))
.formnBreadth(rs.getString("formn_breadth"))
.maxBreadth(rs.getString("max_breadth"))
.depth(rs.getString("depth"))
.draft(rs.getString("draft"))
.keelMastHg(rs.getString("keel_mast_hg"))
.bulbBow(rs.getString("bulb_bow"))
.gt(rs.getString("gt"))
.ntTon(rs.getString("nt_ton"))
.dwt(rs.getString("dwt"))
.displacement(rs.getString("displacement"))
.lightDisplacementTon(rs.getString("light_displacement_ton"))
.cgt(rs.getString("cgt"))
.fldngOneCmPerTonTpci(rs.getString("fldng_one_cm_per_ton_tpci"))
.tonEfectDay(rs.getString("ton_efect_day"))
.calcfrmDwt(rs.getString("calcfrm_dwt"))
.teuCnt(rs.getString("teu_cnt"))
.teuCapacity(rs.getString("teu_capacity"))
.grainCapacityM3(rs.getString("grain_capacity_m3"))
.baleCapacity(rs.getString("bale_capacity"))
.liquidCapacity(rs.getString("liquid_capacity"))
.gasM3(rs.getString("gas_m3"))
.insulatedM3(rs.getString("insulated_m3"))
.passengerCapacity(rs.getString("passenger_capacity"))
.bollardPull(rs.getString("bollard_pull"))
.svcSpd(rs.getString("svc_spd"))
.mainEngineType(rs.getString("main_engine_type"))
.fuelCnsmpSpdOne(rs.getString("fuel_cnsmp_spd_one"))
.fuelCnsmpamtValOne(rs.getString("fuel_cnsmpamt_val_one"))
.fuelCnsmpSpdTwo(rs.getString("fuel_cnsmp_spd_two"))
.fuelCnsmpamtValTwo(rs.getString("fuel_cnsmpamt_val_two"))
.totalFuelCapacityM3(rs.getString("total_fuel_capacity_m3"))
.blrMftr(rs.getString("blr_mftr"))
.proplrMftr(rs.getString("proplr_mftr"))
.cargoCapacityM3Desc(rs.getString("cargo_capacity_m3_desc"))
.eqpmntDesc(rs.getString("eqpmnt_desc"))
.hdn(rs.getString("hdn"))
.hatcheDesc(rs.getString("hatche_desc"))
.laneDoorRampDesc(rs.getString("lane_door_ramp_desc"))
.spcTankDesc(rs.getString("spc_tank_desc"))
.tankDesc(rs.getString("tank_desc"))
.prmovrDesc(rs.getString("prmovr_desc"))
.prmovrOvrvwDesc(rs.getString("prmovr_ovrvw_desc"))
.auxDesc(rs.getString("aux_desc"))
.asstGnrtrDesc(rs.getString("asst_gnrtr_desc"))
.fuelDesc(rs.getString("fuel_desc"))
.docCompanyCd(rs.getString("doc_company_cd"))
.groupActlOwnrCompanyCd(rs.getString("group_actl_ownr_company_cd"))
.operator(rs.getString("operator"))
.operatorCompanyCd(rs.getString("operator_company_cd"))
.shipMngrCompanyCd(rs.getString("ship_mngr_company_cd"))
.techMngrCd(rs.getString("tech_mngr_cd"))
.regShponrCd(rs.getString("reg_shponr_cd"))
.lastMdfcnDt(rs.getString("last_mdfcn_dt"))
.build();
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceShipDetailData);
businessJdbcTemplate.update(sql, targetExecutionId);
}
}
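The `BaseSyncReader` that these readers now extend is not shown in this diff. Judging from what the subclasses call (`super(businessDataSource, tableMetaInfo)`, plus the `getSourceTable()` and `mapRow(...)` overrides), it presumably centralizes the buffer-and-fetch loop that each reader previously duplicated. A minimal sketch of that template-method shape, with Spring's `JdbcTemplate` replaced by plain functional interfaces so the sketch stays self-contained — all names here are assumptions, not taken from the repository:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;
import java.util.function.Function;
import java.util.function.Supplier;

// Hypothetical sketch of the template-method pattern BaseSyncReader likely
// factors out of the readers. The real class presumably wraps the
// businessDataSource and CommonSql queries instead of these lambdas.
abstract class SyncReaderSketch<T> {
    private final Deque<T> buffer = new ArrayDeque<>();
    private final Supplier<Long> nextTargetId;       // stands in for getNextTargetQuery
    private final Function<Long, List<T>> loadGroup; // stands in for getTargetDataQuery + mapRow

    SyncReaderSketch(Supplier<Long> nextTargetId, Function<Long, List<T>> loadGroup) {
        this.nextTargetId = nextTargetId;
        this.loadGroup = loadGroup;
    }

    /** Subclasses only name their source table in the real code. */
    protected abstract String getSourceTable();

    /** Same contract as ItemReader.read(): null means "no more items". */
    public T read() {
        if (buffer.isEmpty()) {
            fetchNextGroup();
        }
        return buffer.poll(); // null when the buffer is still empty
    }

    private void fetchNextGroup() {
        Long id = nextTargetId.get();
        if (id != null) {
            buffer.addAll(loadGroup.apply(id));
            // the real code would also flip the row's status to 'P' here
        }
    }
}
```

With this shape in place, each concrete reader shrinks to the table name plus a row mapper, which is exactly what the per-file diffs below show.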


@@ -1,74 +1,38 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.ShipManagerHistoryDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class ShipManagerHistoryReader implements ItemReader<ShipManagerHistoryDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<ShipManagerHistoryDto> allDataBuffer = new ArrayList<>();
public class ShipManagerHistoryReader extends BaseSyncReader<ShipManagerHistoryDto> {
public ShipManagerHistoryReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public ShipManagerHistoryDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceShipManagerHistory;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceShipManagerHistory), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[ShipManagerHistoryReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceShipManagerHistory);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return ShipManagerHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipMngCompanySeq(rs.getString("ship_mng_company_seq"))
.efectStaDay(rs.getString("efect_sta_day"))
.shipMngrCd(rs.getString("ship_mngr_cd"))
.shipMngr(rs.getString("ship_mngr"))
.companyStatus(rs.getString("company_status"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceShipManagerHistory);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected ShipManagerHistoryDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return ShipManagerHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipMngCompanySeq(rs.getString("ship_mng_company_seq"))
.efectStaDay(rs.getString("efect_sta_day"))
.shipMngrCd(rs.getString("ship_mngr_cd"))
.shipMngr(rs.getString("ship_mngr"))
.companyStatus(rs.getString("company_status"))
.build();
}
}
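The three `CommonSql` helpers the old readers call (`getNextTargetQuery`, `getTargetDataQuery`, `getProcessBatchQuery`) drive an 'N' → 'P' status handshake on the source table. A speculative sketch of the SQL they might generate — the `batch_status` and `job_execution_id` column names are assumptions, not taken from this diff:

```java
// Speculative stand-in for CommonSql; table is interpolated, column names assumed.
final class CommonSqlSketch {
    // Smallest job_execution_id still waiting ('N' = not yet processed).
    static String getNextTargetQuery(String table) {
        return "SELECT MIN(job_execution_id) FROM " + table + " WHERE batch_status = 'N'";
    }

    // All rows belonging to that one execution id.
    static String getTargetDataQuery(String table) {
        return "SELECT * FROM " + table + " WHERE job_execution_id = ?";
    }

    // Mark the id as in progress ('P') so the next fetch skips it.
    static String getProcessBatchQuery(String table) {
        return "UPDATE " + table + " SET batch_status = 'P' WHERE job_execution_id = ?";
    }
}
```

Whatever the exact SQL, this group-at-a-time handshake is why `read()` can keep returning buffered rows while `fetchNextGroup()` only ever claims one execution id per query.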


@@ -1,66 +1,34 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.SisterShipLinksDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class SisterShipLinksReader implements ItemReader<SisterShipLinksDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<SisterShipLinksDto> allDataBuffer = new ArrayList<>();
public class SisterShipLinksReader extends BaseSyncReader<SisterShipLinksDto> {
public SisterShipLinksReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public SisterShipLinksDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceSisterShipLinks;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceSisterShipLinks), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[SisterShipLinksReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceSisterShipLinks);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return SisterShipLinksDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.linkImoNo(rs.getString("link_imo_no"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceSisterShipLinks);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected SisterShipLinksDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return SisterShipLinksDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.linkImoNo(rs.getString("link_imo_no"))
.build();
}
}


@@ -1,68 +1,36 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.SpecialFeatureDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class SpecialFeatureReader implements ItemReader<SpecialFeatureDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<SpecialFeatureDto> allDataBuffer = new ArrayList<>();
public class SpecialFeatureReader extends BaseSyncReader<SpecialFeatureDto> {
public SpecialFeatureReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public SpecialFeatureDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceSpecialFeature;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceSpecialFeature), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[SpecialFeatureReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceSpecialFeature);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return SpecialFeatureDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipSpcFetrSeq(rs.getString("ship_spc_fetr_seq"))
.spcMttrCd(rs.getString("spc_mttr_cd"))
.spcMttr(rs.getString("spc_mttr"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceSpecialFeature);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected SpecialFeatureDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return SpecialFeatureDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipSpcFetrSeq(rs.getString("ship_spc_fetr_seq"))
.spcMttrCd(rs.getString("spc_mttr_cd"))
.spcMttr(rs.getString("spc_mttr"))
.build();
}
}


@@ -1,69 +1,37 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.StatusHistoryDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class StatusHistoryReader implements ItemReader<StatusHistoryDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<StatusHistoryDto> allDataBuffer = new ArrayList<>();
public class StatusHistoryReader extends BaseSyncReader<StatusHistoryDto> {
public StatusHistoryReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public StatusHistoryDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceStatusHistory;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceStatusHistory), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[StatusHistoryReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceStatusHistory);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return StatusHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipStatusHstrySeq(rs.getString("ship_status_hstry_seq"))
.statusCd(rs.getString("status_cd"))
.statusChgYmd(rs.getString("status_chg_ymd"))
.status(rs.getString("status"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceStatusHistory);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected StatusHistoryDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return StatusHistoryDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipStatusHstrySeq(rs.getString("ship_status_hstry_seq"))
.statusCd(rs.getString("status_cd"))
.statusChgYmd(rs.getString("status_chg_ymd"))
.status(rs.getString("status"))
.build();
}
}


@@ -1,70 +1,38 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.StowageCommodityDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class StowageCommodityReader implements ItemReader<StowageCommodityDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<StowageCommodityDto> allDataBuffer = new ArrayList<>();
public class StowageCommodityReader extends BaseSyncReader<StowageCommodityDto> {
public StowageCommodityReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public StowageCommodityDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceStowageCommodity;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceStowageCommodity), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[StowageCommodityReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceStowageCommodity);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return StowageCommodityDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipCargoCapacitySeq(rs.getString("ship_cargo_capacity_seq"))
.capacityCd(rs.getString("capacity_cd"))
.capacityCdDesc(rs.getString("capacity_cd_desc"))
.cargoCd(rs.getString("cargo_cd"))
.cargoNm(rs.getString("cargo_nm"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceStowageCommodity);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected StowageCommodityDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return StowageCommodityDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.shipCargoCapacitySeq(rs.getString("ship_cargo_capacity_seq"))
.capacityCd(rs.getString("capacity_cd"))
.capacityCdDesc(rs.getString("capacity_cd_desc"))
.cargoCd(rs.getString("cargo_cd"))
.cargoNm(rs.getString("cargo_nm"))
.build();
}
}


@@ -1,69 +1,37 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.SurveyDatesHistoryUniqueDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class SurveyDatesHistoryUniqueReader implements ItemReader<SurveyDatesHistoryUniqueDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<SurveyDatesHistoryUniqueDto> allDataBuffer = new ArrayList<>();
public class SurveyDatesHistoryUniqueReader extends BaseSyncReader<SurveyDatesHistoryUniqueDto> {
public SurveyDatesHistoryUniqueReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public SurveyDatesHistoryUniqueDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceSurveyDatesHistoryUnique;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceSurveyDatesHistoryUnique), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[SurveyDatesHistoryUniqueReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceSurveyDatesHistoryUnique);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return SurveyDatesHistoryUniqueDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.clficCd(rs.getString("clfic_cd"))
.inspectionType(rs.getString("inspection_type"))
.inspectionYmd(rs.getString("inspection_ymd"))
.clfic(rs.getString("clfic"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceSurveyDatesHistoryUnique);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected SurveyDatesHistoryUniqueDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return SurveyDatesHistoryUniqueDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.clficCd(rs.getString("clfic_cd"))
.inspectionType(rs.getString("inspection_type"))
.inspectionYmd(rs.getString("inspection_ymd"))
.clfic(rs.getString("clfic"))
.build();
}
}


@@ -1,72 +1,40 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.SurveyDatesDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class SurveyDatesReader implements ItemReader<SurveyDatesDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<SurveyDatesDto> allDataBuffer = new ArrayList<>();
public class SurveyDatesReader extends BaseSyncReader<SurveyDatesDto> {
public SurveyDatesReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public SurveyDatesDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceSurveyDates;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceSurveyDates), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[SurveyDatesReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceSurveyDates);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return SurveyDatesDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.clficCd(rs.getString("clfic_cd"))
.clfic(rs.getString("clfic"))
.dckngInspection(rs.getString("dckng_inspection"))
.fxtmInspection(rs.getString("fxtm_inspection"))
.annualInspection(rs.getString("annual_inspection"))
.mchnFxtmInspectionYmd(rs.getString("mchn_fxtm_inspection_ymd"))
.tlsftInspectionYmd(rs.getString("tlsft_inspection_ymd"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceSurveyDates);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected SurveyDatesDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return SurveyDatesDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.imoNo(rs.getString("imo_no"))
.clficCd(rs.getString("clfic_cd"))
.clfic(rs.getString("clfic"))
.dckngInspection(rs.getString("dckng_inspection"))
.fxtmInspection(rs.getString("fxtm_inspection"))
.annualInspection(rs.getString("annual_inspection"))
.mchnFxtmInspectionYmd(rs.getString("mchn_fxtm_inspection_ymd"))
.tlsftInspectionYmd(rs.getString("tlsft_inspection_ymd"))
.build();
}
}


@@ -1,93 +1,61 @@
package com.snp.batch.jobs.datasync.batch.ship.reader;
import com.snp.batch.common.util.CommonSql;
import com.snp.batch.common.batch.reader.BaseSyncReader;
import com.snp.batch.common.util.TableMetaInfo;
import com.snp.batch.jobs.datasync.batch.ship.dto.TbCompanyDetailDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.util.ArrayList;
import java.util.List;
import java.sql.ResultSet;
import java.sql.SQLException;
@Slf4j
public class TbCompanyDetailReader implements ItemReader<TbCompanyDetailDto> {
private final TableMetaInfo tableMetaInfo;
private final JdbcTemplate businessJdbcTemplate;
private List<TbCompanyDetailDto> allDataBuffer = new ArrayList<>();
public class TbCompanyDetailReader extends BaseSyncReader<TbCompanyDetailDto> {
public TbCompanyDetailReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
this.tableMetaInfo = tableMetaInfo;
super(businessDataSource, tableMetaInfo);
}
@Override
public TbCompanyDetailDto read() throws Exception {
if (allDataBuffer.isEmpty()) {
fetchNextGroup();
}
if (allDataBuffer.isEmpty()) {
return null;
}
return allDataBuffer.remove(0);
protected String getSourceTable() {
return tableMetaInfo.sourceTbCompanyDetail;
}
private void fetchNextGroup() {
Long nextTargetId = null;
try {
nextTargetId = businessJdbcTemplate.queryForObject(
CommonSql.getNextTargetQuery(tableMetaInfo.sourceTbCompanyDetail), Long.class);
} catch (Exception e) {
return;
}
if (nextTargetId != null) {
log.info("[TbCompanyDetailReader] Found next target ID: {}", nextTargetId);
String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceTbCompanyDetail);
final Long targetId = nextTargetId;
this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
return TbCompanyDetailDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.companyCd(rs.getString("company_cd"))
.lastUpdYmd(rs.getString("last_upd_ymd"))
.careCd(rs.getString("care_cd"))
.companyStatus(rs.getString("company_status"))
.fullNm(rs.getString("full_nm"))
.companyNameAbbr(rs.getString("company_name_abbr"))
.companyFndnYmd(rs.getString("company_fndn_ymd"))
.prntCompanyCd(rs.getString("prnt_company_cd"))
.countryNm(rs.getString("country_nm"))
.ctyNm(rs.getString("cty_nm"))
.oaAddr(rs.getString("oa_addr"))
.emlAddr(rs.getString("eml_addr"))
.tel(rs.getString("tel"))
.faxNo(rs.getString("fax_no"))
.wbstUrl(rs.getString("wbst_url"))
.countryCtrl(rs.getString("country_ctrl"))
.countryCtrlCd(rs.getString("country_ctrl_cd"))
.countryReg(rs.getString("country_reg"))
.countryRegCd(rs.getString("country_reg_cd"))
.regionCd(rs.getString("region_cd"))
.distNm(rs.getString("dist_nm"))
.distNo(rs.getString("dist_no"))
.mailAddrRear(rs.getString("mail_addr_rear"))
.mailAddrFrnt(rs.getString("mail_addr_frnt"))
.poBox(rs.getString("po_box"))
.dtlAddrOne(rs.getString("dtl_addr_one"))
.dtlAddrTwo(rs.getString("dtl_addr_two"))
.dtlAddrThr(rs.getString("dtl_addr_thr"))
.tlx(rs.getString("tlx"))
.build();
}, nextTargetId);
updateBatchProcessing(nextTargetId);
}
}
private void updateBatchProcessing(Long targetExecutionId) {
String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceTbCompanyDetail);
businessJdbcTemplate.update(sql, targetExecutionId);
@Override
protected TbCompanyDetailDto mapRow(ResultSet rs, Long targetId) throws SQLException {
return TbCompanyDetailDto.builder()
.jobExecutionId(targetId)
.datasetVer(rs.getString("dataset_ver"))
.companyCd(rs.getString("company_cd"))
.lastUpdYmd(rs.getString("last_upd_ymd"))
.careCd(rs.getString("care_cd"))
.companyStatus(rs.getString("company_status"))
.fullNm(rs.getString("full_nm"))
.companyNameAbbr(rs.getString("company_name_abbr"))
.companyFndnYmd(rs.getString("company_fndn_ymd"))
.prntCompanyCd(rs.getString("prnt_company_cd"))
.countryNm(rs.getString("country_nm"))
.ctyNm(rs.getString("cty_nm"))
.oaAddr(rs.getString("oa_addr"))
.emlAddr(rs.getString("eml_addr"))
.tel(rs.getString("tel"))
.faxNo(rs.getString("fax_no"))
.wbstUrl(rs.getString("wbst_url"))
.countryCtrl(rs.getString("country_ctrl"))
.countryCtrlCd(rs.getString("country_ctrl_cd"))
.countryReg(rs.getString("country_reg"))
.countryRegCd(rs.getString("country_reg_cd"))
.regionCd(rs.getString("region_cd"))
.distNm(rs.getString("dist_nm"))
.distNo(rs.getString("dist_no"))
.mailAddrRear(rs.getString("mail_addr_rear"))
.mailAddrFrnt(rs.getString("mail_addr_frnt"))
.poBox(rs.getString("po_box"))
.dtlAddrOne(rs.getString("dtl_addr_one"))
.dtlAddrTwo(rs.getString("dtl_addr_two"))
.dtlAddrThr(rs.getString("dtl_addr_thr"))
.tlx(rs.getString("tlx"))
.build();
}
}
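The `BaseSyncReader` class that these readers now extend is not shown in this diff. As a reading aid only, here is a minimal, dependency-free sketch of the buffered template-method pattern the removed `read()`/`fetchNextGroup()` code implemented by hand; the class name, method names, and the in-memory data source are illustrative assumptions, not the real Spring Batch base class:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

// Sketch: read one "group" of rows at a time, then drain it item by item.
// The real BaseSyncReader presumably does this against a JdbcTemplate.
abstract class BufferedSyncReader<T> {
    private final Deque<T> buffer = new ArrayDeque<>();

    // Template method: refill the buffer when empty; null signals end of data,
    // matching the ItemReader contract the old readers implemented directly.
    public final T read() {
        if (buffer.isEmpty()) {
            buffer.addAll(fetchNextGroup());
        }
        return buffer.pollFirst();
    }

    // Subclasses supply the next group of rows (empty list = exhausted).
    protected abstract List<T> fetchNextGroup();
}

class ReaderSketch {
    public static void main(String[] args) {
        // Fake "source table" holding two groups of rows.
        Deque<List<String>> groups = new ArrayDeque<>(List.of(
                List.of("row1", "row2"), List.of("row3")));

        BufferedSyncReader<String> reader = new BufferedSyncReader<>() {
            @Override
            protected List<String> fetchNextGroup() {
                List<String> next = groups.pollFirst();
                return next == null ? List.of() : next;
            }
        };

        String item;
        while ((item = reader.read()) != null) {
            System.out.println(item);
        }
    }
}
```

With this split, each concrete reader in the diff only has to provide its source-table name and a `ResultSet`-to-DTO mapping, which is exactly what the new `getSourceTable()` and `mapRow(...)` overrides do.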


@@ -1,70 +1,38 @@
 package com.snp.batch.jobs.datasync.batch.ship.reader;

-import com.snp.batch.common.util.CommonSql;
+import com.snp.batch.common.batch.reader.BaseSyncReader;
 import com.snp.batch.common.util.TableMetaInfo;
 import com.snp.batch.jobs.datasync.batch.ship.dto.TechnicalManagerHistoryDto;
 import lombok.extern.slf4j.Slf4j;
-import org.springframework.batch.item.ItemReader;
 import org.springframework.beans.factory.annotation.Qualifier;
-import org.springframework.jdbc.core.JdbcTemplate;
 import javax.sql.DataSource;
-import java.util.ArrayList;
-import java.util.List;
+import java.sql.ResultSet;
+import java.sql.SQLException;

 @Slf4j
-public class TechnicalManagerHistoryReader implements ItemReader<TechnicalManagerHistoryDto> {
-    private final TableMetaInfo tableMetaInfo;
-    private final JdbcTemplate businessJdbcTemplate;
-    private List<TechnicalManagerHistoryDto> allDataBuffer = new ArrayList<>();
+public class TechnicalManagerHistoryReader extends BaseSyncReader<TechnicalManagerHistoryDto> {

     public TechnicalManagerHistoryReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
-        this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
-        this.tableMetaInfo = tableMetaInfo;
+        super(businessDataSource, tableMetaInfo);
     }

     @Override
-    public TechnicalManagerHistoryDto read() throws Exception {
-        if (allDataBuffer.isEmpty()) {
-            fetchNextGroup();
-        }
-        if (allDataBuffer.isEmpty()) {
-            return null;
-        }
-        return allDataBuffer.remove(0);
+    protected String getSourceTable() {
+        return tableMetaInfo.sourceTechnicalManagerHistory;
     }

-    private void fetchNextGroup() {
-        Long nextTargetId = null;
-        try {
-            nextTargetId = businessJdbcTemplate.queryForObject(
-                    CommonSql.getNextTargetQuery(tableMetaInfo.sourceTechnicalManagerHistory), Long.class);
-        } catch (Exception e) {
-            return;
-        }
-        if (nextTargetId != null) {
-            log.info("[TechnicalManagerHistoryReader] Next target ID found: {}", nextTargetId);
-            String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceTechnicalManagerHistory);
-            final Long targetId = nextTargetId;
-            this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
-                return TechnicalManagerHistoryDto.builder()
-                        .jobExecutionId(targetId)
-                        .datasetVer(rs.getString("dataset_ver"))
-                        .imoNo(rs.getString("imo_no"))
-                        .shipTechMngCompanySeq(rs.getString("ship_tech_mng_company_seq"))
-                        .efectStaDay(rs.getString("efect_sta_day"))
-                        .techMngrCd(rs.getString("tech_mngr_cd"))
-                        .techMngr(rs.getString("tech_mngr"))
-                        .companyStatus(rs.getString("company_status"))
-                        .build();
-            }, nextTargetId);
-            updateBatchProcessing(nextTargetId);
-        }
-    }
-
-    private void updateBatchProcessing(Long targetExecutionId) {
-        String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceTechnicalManagerHistory);
-        businessJdbcTemplate.update(sql, targetExecutionId);
-    }
+
+    @Override
+    protected TechnicalManagerHistoryDto mapRow(ResultSet rs, Long targetId) throws SQLException {
+        return TechnicalManagerHistoryDto.builder()
+                .jobExecutionId(targetId)
+                .datasetVer(rs.getString("dataset_ver"))
+                .imoNo(rs.getString("imo_no"))
+                .shipTechMngCompanySeq(rs.getString("ship_tech_mng_company_seq"))
+                .efectStaDay(rs.getString("efect_sta_day"))
+                .techMngrCd(rs.getString("tech_mngr_cd"))
+                .techMngr(rs.getString("tech_mngr"))
+                .companyStatus(rs.getString("company_status"))
+                .build();
+    }
 }


@@ -1,73 +1,41 @@
 package com.snp.batch.jobs.datasync.batch.ship.reader;

-import com.snp.batch.common.util.CommonSql;
+import com.snp.batch.common.batch.reader.BaseSyncReader;
 import com.snp.batch.common.util.TableMetaInfo;
 import com.snp.batch.jobs.datasync.batch.ship.dto.ThrustersDto;
 import lombok.extern.slf4j.Slf4j;
-import org.springframework.batch.item.ItemReader;
 import org.springframework.beans.factory.annotation.Qualifier;
-import org.springframework.jdbc.core.JdbcTemplate;
 import javax.sql.DataSource;
-import java.util.ArrayList;
-import java.util.List;
+import java.sql.ResultSet;
+import java.sql.SQLException;

 @Slf4j
-public class ThrustersReader implements ItemReader<ThrustersDto> {
-    private final TableMetaInfo tableMetaInfo;
-    private final JdbcTemplate businessJdbcTemplate;
-    private List<ThrustersDto> allDataBuffer = new ArrayList<>();
+public class ThrustersReader extends BaseSyncReader<ThrustersDto> {

     public ThrustersReader(@Qualifier("businessDataSource") DataSource businessDataSource, TableMetaInfo tableMetaInfo) {
-        this.businessJdbcTemplate = new JdbcTemplate(businessDataSource);
-        this.tableMetaInfo = tableMetaInfo;
+        super(businessDataSource, tableMetaInfo);
     }

     @Override
-    public ThrustersDto read() throws Exception {
-        if (allDataBuffer.isEmpty()) {
-            fetchNextGroup();
-        }
-        if (allDataBuffer.isEmpty()) {
-            return null;
-        }
-        return allDataBuffer.remove(0);
+    protected String getSourceTable() {
+        return tableMetaInfo.sourceThrusters;
     }

-    private void fetchNextGroup() {
-        Long nextTargetId = null;
-        try {
-            nextTargetId = businessJdbcTemplate.queryForObject(
-                    CommonSql.getNextTargetQuery(tableMetaInfo.sourceThrusters), Long.class);
-        } catch (Exception e) {
-            return;
-        }
-        if (nextTargetId != null) {
-            log.info("[ThrustersReader] Next target ID found: {}", nextTargetId);
-            String sql = CommonSql.getTargetDataQuery(tableMetaInfo.sourceThrusters);
-            final Long targetId = nextTargetId;
-            this.allDataBuffer = businessJdbcTemplate.query(sql, (rs, rowNum) -> {
-                return ThrustersDto.builder()
-                        .jobExecutionId(targetId)
-                        .datasetVer(rs.getString("dataset_ver"))
-                        .imoNo(rs.getString("imo_no"))
-                        .thrstrSeq(rs.getString("thrstr_seq"))
-                        .thrstrTypeCd(rs.getString("thrstr_type_cd"))
-                        .thrstrType(rs.getString("thrstr_type"))
-                        .thrstrCnt(rs.getBigDecimal("thrstr_cnt"))
-                        .thrstrPosition(rs.getString("thrstr_position"))
-                        .thrstrPowerBhp(rs.getBigDecimal("thrstr_power_bhp"))
-                        .thrstrPowerKw(rs.getBigDecimal("thrstr_power_kw"))
-                        .instlMth(rs.getString("instl_mth"))
-                        .build();
-            }, nextTargetId);
-            updateBatchProcessing(nextTargetId);
-        }
-    }
-
-    private void updateBatchProcessing(Long targetExecutionId) {
-        String sql = CommonSql.getProcessBatchQuery(tableMetaInfo.sourceThrusters);
-        businessJdbcTemplate.update(sql, targetExecutionId);
-    }
+
+    @Override
+    protected ThrustersDto mapRow(ResultSet rs, Long targetId) throws SQLException {
+        return ThrustersDto.builder()
+                .jobExecutionId(targetId)
+                .datasetVer(rs.getString("dataset_ver"))
+                .imoNo(rs.getString("imo_no"))
+                .thrstrSeq(rs.getString("thrstr_seq"))
+                .thrstrTypeCd(rs.getString("thrstr_type_cd"))
+                .thrstrType(rs.getString("thrstr_type"))
+                .thrstrCnt(rs.getBigDecimal("thrstr_cnt"))
+                .thrstrPosition(rs.getString("thrstr_position"))
+                .thrstrPowerBhp(rs.getBigDecimal("thrstr_power_bhp"))
+                .thrstrPowerKw(rs.getBigDecimal("thrstr_power_kw"))
+                .instlMth(rs.getString("instl_mth"))
+                .build();
+    }
 }


@@ -142,59 +142,6 @@ public class ShipDataSql {
""".formatted(TARGET_SCHEMA, targetTable);
}
public static String getShipMainInfoUpsertSql(String targetTable) {
return """
INSERT INTO %s.%s (
crt_dt, creatr_id,
imo_no, mmsi_no, ship_nm, clsgn_no, country_nm,
ship_reg_hrbr, clfic_asctn_nm, ship_knd_lv_five, ship_knd_dtl_lv_five,
ship_build_yy, shpyrd_nm, ship_whlnth, ship_molbth, ship_depth, ship_draft,
ship_total_ton, dwt, cntnr_units, svc_crspd, main_engine_fom,
ship_status, ship_operator, ship_country_cd, ship_knd_lv_two, cargo_type,
last_mdfcn_dt
)
VALUES (
CURRENT_TIMESTAMP, 'SYSTEM',
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?
)
ON CONFLICT (imo_no)
DO UPDATE SET
mdfcn_dt = CURRENT_TIMESTAMP,
mdfr_id = 'SYSTEM',
mmsi_no = EXCLUDED.mmsi_no,
ship_nm = EXCLUDED.ship_nm,
clsgn_no = EXCLUDED.clsgn_no,
country_nm = EXCLUDED.country_nm,
ship_reg_hrbr = EXCLUDED.ship_reg_hrbr,
clfic_asctn_nm = EXCLUDED.clfic_asctn_nm,
ship_knd_lv_five = EXCLUDED.ship_knd_lv_five,
ship_knd_dtl_lv_five = EXCLUDED.ship_knd_dtl_lv_five,
ship_build_yy = EXCLUDED.ship_build_yy,
shpyrd_nm = EXCLUDED.shpyrd_nm,
ship_whlnth = EXCLUDED.ship_whlnth,
ship_molbth = EXCLUDED.ship_molbth,
ship_depth = EXCLUDED.ship_depth,
ship_draft = EXCLUDED.ship_draft,
ship_total_ton = EXCLUDED.ship_total_ton,
dwt = EXCLUDED.dwt,
cntnr_units = EXCLUDED.cntnr_units,
svc_crspd = EXCLUDED.svc_crspd,
main_engine_fom = EXCLUDED.main_engine_fom,
ship_status = EXCLUDED.ship_status,
ship_operator = EXCLUDED.ship_operator,
ship_country_cd = EXCLUDED.ship_country_cd,
ship_knd_lv_two = EXCLUDED.ship_knd_lv_two,
cargo_type = EXCLUDED.cargo_type,
last_mdfcn_dt = EXCLUDED.last_mdfcn_dt;
""".formatted(TARGET_SCHEMA, targetTable);
}
public static String getShipAddInfoUpsertSql(String targetTable) {
return """
INSERT INTO %s.%s (

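The removed `getShipMainInfoUpsertSql` built a PostgreSQL `INSERT ... ON CONFLICT` statement by interpolating the schema and table name into a Java text block via `String.formatted`. A minimal standalone illustration of that mechanism follows; the schema constant, table name, and columns here are simplified placeholders, not the project's real values:

```java
public class UpsertSqlSketch {
    // Hypothetical schema; in ShipDataSql this comes from TARGET_SCHEMA.
    static final String TARGET_SCHEMA = "snp";

    // Builds an upsert keyed on imo_no: insert, or update the row on conflict.
    static String upsertSql(String targetTable) {
        return """
                INSERT INTO %s.%s (imo_no, ship_nm)
                VALUES (?, ?)
                ON CONFLICT (imo_no)
                DO UPDATE SET ship_nm = EXCLUDED.ship_nm;
                """.formatted(TARGET_SCHEMA, targetTable);
    }

    public static void main(String[] args) {
        // %s.%s resolves to "snp.tb_ship_main_info" in the emitted SQL.
        System.out.print(upsertSql("tb_ship_main_info"));
    }
}
```

Text blocks plus `formatted` keep the table name configurable while the column list and `EXCLUDED.*` update clauses stay readable, which is presumably why the original SQL builders were written this way.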

@@ -35,7 +35,6 @@ import java.util.List;
  */
 public interface ShipRepository {
     void saveShipInfoMst(List<ShipInfoMstEntity> shipInfoMstEntityList);
-    void saveShipMainInfo(List<ShipInfoMstEntity> shipInfoMstEntityList);
     void saveShipAddInfo(List<ShipAddInfoEntity> shipAddInfoEntityList);
     void saveBareboatCharterHistory(List<BareboatCharterHistoryEntity> bareboatCharterHistoryEntityList);
     void saveCallsignAndMmsiHistory(List<CallsignAndMmsiHistoryEntity> callsignAndMmsiHistoryEntityList);

Some files were not shown because too many files have changed in this diff.