Commit bb7af8a: chore(server): writing import project readme (#1851)
1 parent ae53cd7

server/README.md (149 additions, 0 deletions)

curl -X POST http://localhost:4443/storage/v1/b\?project\=your-project-id \

3. set `REEARTH_GCS_BUCKETNAME` to `test-bucket`

※ The project name and bucket name can be anything you want.

## Project Export and Import

Re:Earth provides functionality to export and import complete projects, including all associated data.

### Export

Projects can be exported via the GraphQL API using the `ExportProject` mutation.

#### Export Process Flow

1. **Create temporary zip file** - A zip archive is created with the project ID as its filename
2. **Export project data** - Project metadata and settings
3. **Export scene data** - Scene configuration and visualization settings
4. **Export plugins** - All plugins used in the project
5. **Export assets** - Images, 3D models, and other assets
6. **Add metadata** - Export information including:
   - `host`: Current host URL
   - `project`: Project ID
   - `timestamp`: Export timestamp (RFC3339 format)
   - `exportDataVersion`: Data format version (current: `"1"`)
7. **Upload to storage** - The completed zip file is uploaded to the configured storage
8. **Return path** - Returns the download path: `/export/{projectId}.zip`

#### Exported Zip Structure

```
project.zip
├── project.json      # Complete project data with metadata
├── assets/           # Project assets
│   ├── image1.png
│   └── model.gltf
└── plugins/          # Plugin files
    └── plugin1/
```

#### Export Data Version

The `exportDataVersion` field enables compatibility management for future format changes:

- Current version: `"1"`
- Version is embedded in the `exportedInfo` section of `project.json`
- Future versions can support schema migrations and new features
**File**: `internal/adapter/gql/resolver_mutation_project.go:149`

### Import

Projects can be imported via two methods:

#### 1. Split Upload API (`POST /api/split-import`)

Handles chunked file uploads for large project files.

**Process Flow**:

1. **Chunk Upload** - The client uploads the file in chunks (16MB each)
2. **Session Management** - The server tracks upload progress per file ID
3. **Temporary Project Creation** - On the first chunk, creates a placeholder project with status `UPLOADING`
4. **Chunk Assembly** - When all chunks are received, assembles the complete file
5. **Async Processing** - Spawns a goroutine to process the import
6. **Import Execution** - Calls `ImportProject()` with the assembled data
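
The chunk-tracking part of this flow can be sketched as an in-memory session store. This is illustrative only, not the server's actual implementation; names like `uploadSession` and `addChunk` are hypothetical.

```go
package main

import (
	"bytes"
	"fmt"
	"sync"
)

// chunkSize matches the 16MB chunk size used by the split upload API.
const chunkSize = 16 << 20

// uploadSession tracks the chunks received so far for one file ID.
type uploadSession struct {
	total  int
	chunks map[int][]byte
}

type uploader struct {
	mu       sync.Mutex
	sessions map[string]*uploadSession
}

func newUploader() *uploader {
	return &uploader{sessions: make(map[string]*uploadSession)}
}

// addChunk stores one chunk; once all chunks have arrived it returns
// the assembled file and true, mirroring steps 1-4 above.
func (u *uploader) addChunk(fileID string, index, total int, data []byte) ([]byte, bool) {
	u.mu.Lock()
	defer u.mu.Unlock()

	s, ok := u.sessions[fileID]
	if !ok {
		// First chunk: this is where the placeholder project with
		// status UPLOADING would be created (step 3).
		s = &uploadSession{total: total, chunks: make(map[int][]byte)}
		u.sessions[fileID] = s
	}
	s.chunks[index] = data

	if len(s.chunks) < s.total {
		return nil, false
	}
	// All chunks received: assemble in index order (step 4).
	var buf bytes.Buffer
	for i := 0; i < s.total; i++ {
		buf.Write(s.chunks[i])
	}
	delete(u.sessions, fileID)
	return buf.Bytes(), true
}

func main() {
	u := newUploader()
	_, done := u.addChunk("file-1", 0, 2, []byte("hello "))
	fmt.Println(done) // false: still waiting for chunk 1
	data, done := u.addChunk("file-1", 1, 2, []byte("world"))
	fmt.Println(done, string(data)) // true hello world
}
```

In the real server, assembly is followed by the async processing of step 5; here the caller would hand the assembled bytes to the import logic.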
**File**: `internal/app/file_split_uploader.go:69`

#### 2. Storage Trigger API (`POST /api/import-project`)

Triggered automatically when a project zip file is uploaded directly to storage (e.g., by a GCS/S3 bucket notification).

**Authentication**:

- No auth token is required (the endpoint is triggered by the storage service)
- The user context is extracted from the filename: `{workspaceId}-{projectId}-{userId}.zip`
- The operator context is automatically generated from the user ID
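
Extracting the user context from the filename can be sketched as below. This is an assumption-laden illustration, not the server's parser: it assumes the IDs themselves contain no hyphens (true for typical ULID-style IDs, but not guaranteed by this sketch), and `parseImportFilename` is a hypothetical name.

```go
package main

import (
	"fmt"
	"strings"
)

// parseImportFilename splits a {workspaceId}-{projectId}-{userId}.zip
// name into its three IDs, assuming the IDs contain no hyphens.
func parseImportFilename(name string) (wsID, projID, userID string, err error) {
	base, ok := strings.CutSuffix(name, ".zip")
	if !ok {
		return "", "", "", fmt.Errorf("not a zip filename: %q", name)
	}
	parts := strings.Split(base, "-")
	if len(parts) != 3 {
		return "", "", "", fmt.Errorf("unexpected filename format: %q", name)
	}
	return parts[0], parts[1], parts[2], nil
}

func main() {
	ws, proj, user, err := parseImportFilename("ws01-proj02-user03.zip")
	fmt.Println(ws, proj, user, err) // ws01 proj02 user03 <nil>
}
```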
**File**: `internal/app/file_import_common.go:95`

### ImportProject() Implementation

Core import logic that processes the extracted project data.

**Processing Order**:

1. **Project Data** - `ImportProjectData()` - Project metadata and configuration
2. **Assets** - `ImportAssetFiles()` - Upload and register asset files
3. **Scene Creation** - Create a new scene for the imported project
4. **ID Replacement** - Replace the old scene ID with the new scene ID throughout the data
5. **Plugins** - `ImportPlugins()` - Install required plugins and schemas
6. **Scene Data** - `ImportSceneData()` - Scene configuration and layers
7. **Styles** - `ImportStyles()` - Layer styling information
8. **NLS Layers** - `ImportNLSLayers()` - New layer system data
9. **Story** - `ImportStory()` - Storytelling configuration
10. **Status Update** - Mark the import as `SUCCESS` or `FAILED`

**Version Handling**:

The `version` parameter (from `exportDataVersion`) enables format-specific processing:

```go
func ImportProject(
	ctx context.Context,
	usecases *interfaces.Container,
	op *usecase.Operator,
	wsId accountdomain.WorkspaceID,
	pid id.ProjectID,
	importData *[]byte,
	assetsZip map[string]*zip.File,
	pluginsZip map[string]*zip.File,
	result map[string]any,
	version *string, // Export data version for compatibility
) bool
```

**Current Implementation**:

- Version `"1"` is the current format
- The version parameter is extracted but not yet used for branching
- Future versions can implement migration logic based on the version value

**Future Usage Example**:

```go
if version != nil && *version == "2" {
	// Handle version 2 format with new features
	return importV2(...)
}
// Default to version 1 processing
return importV1(...)
```
**File**: `internal/app/file_import_common.go:193`

### Error Handling

All import steps update the project status on failure:

- Status: `ProjectImportStatusFailed`
- The error message is logged to `importResultLog`
- Processing stops at the first error

### Import Status Values

- `ProjectImportStatusNone` - Not imported
- `ProjectImportStatusUploading` - Upload in progress
- `ProjectImportStatusSuccess` - Import completed successfully
- `ProjectImportStatusFailed` - Import failed (check `importResultLog`)
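
A client polling the import status mainly needs to know whether the import has finished. A minimal sketch, assuming string-typed status values (the string forms here are illustrative, not necessarily the exact wire values):

```go
package main

import "fmt"

// ProjectImportStatus mirrors the four status values listed above.
type ProjectImportStatus string

const (
	ProjectImportStatusNone      ProjectImportStatus = "NONE"
	ProjectImportStatusUploading ProjectImportStatus = "UPLOADING"
	ProjectImportStatusSuccess   ProjectImportStatus = "SUCCESS"
	ProjectImportStatusFailed    ProjectImportStatus = "FAILED"
)

// IsTerminal reports whether the import has finished, successfully
// or not, so a poller knows when to stop.
func (s ProjectImportStatus) IsTerminal() bool {
	return s == ProjectImportStatusSuccess || s == ProjectImportStatusFailed
}

func main() {
	fmt.Println(ProjectImportStatusUploading.IsTerminal()) // false
	fmt.Println(ProjectImportStatusFailed.IsTerminal())    // true
}
```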

### Configuration

**File Size Limit**: 500MB (enforced in `pkg/file/zip.go:114`)

**Chunk Size**: 16MB (split upload)

**Cleanup**: Stale upload sessions (older than 24 hours) are automatically cleaned up
