30 changes: 30 additions & 0 deletions packages/cli/src/modules/data-table/data-table.controller.ts
@@ -261,6 +261,36 @@ export class DataTableController {
}
}

@Get('/:dataTableId/download-csv')
@ProjectScope('dataTable:read')
@cubic-dev-ai (bot) commented on Nov 19, 2025:

This endpoint returns full table data but is guarded only by dataTable:read, letting callers who can see metadata download entire datasets without the dataTable:readRow permission that other row-returning routes require. Align the scope with existing row access to prevent unauthorized data export.

Suggested change:
- @ProjectScope('dataTable:read')
+ @ProjectScope('dataTable:readRow')

async downloadDataTableCsv(
Contributor commented:

Can we add an integration test for this endpoint? That would automatically add coverage for all (or most) of the backend code.
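
For illustration, a rough sketch of what such an integration test could look like. The fixture and auth helpers (createAuthAgent, createDataTableFixture) are hypothetical placeholders for the project's real integration-test utilities, and the { data: ... } response envelope is assumed:

// Hypothetical sketch; adapt helper names and setup to the project's integration-test utilities.
import { createAuthAgent, createDataTableFixture } from './shared/test-helpers'; // hypothetical helpers

describe('GET /projects/:projectId/data-tables/:dataTableId/download-csv', () => {
	it('returns the CSV content and the table name', async () => {
		// Seed a table with one user column and one row (hypothetical fixture helper)
		const { projectId, dataTableId } = await createDataTableFixture({
			columns: [{ name: 'name', type: 'string' }],
			rows: [{ name: 'Alice' }],
		});
		const agent = await createAuthAgent(); // authenticated supertest agent

		const response = await agent
			.get(`/projects/${projectId}/data-tables/${dataTableId}/download-csv`)
			.expect(200);

		// Assumes the standard { data: ... } response envelope
		expect(response.body.data.dataTableName).toBeDefined();
		expect(response.body.data.csvContent).toContain('id,name,createdAt,updatedAt');
		expect(response.body.data.csvContent).toContain('Alice');
	});

	it('returns 404 when the data table does not exist', async () => {
		const { projectId } = await createDataTableFixture({ columns: [], rows: [] });
		const agent = await createAuthAgent();

		await agent
			.get(`/projects/${projectId}/data-tables/non-existent-id/download-csv`)
			.expect(404);
	});
});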

req: AuthenticatedRequest<{ projectId: string; dataTableId: string }>,
_res: Response,
) {
try {
const { projectId, dataTableId } = req.params;

// Generate CSV content - this will validate that the table exists
const { csvContent, dataTableName } = await this.dataTableService.generateDataTableCsv(
dataTableId,
projectId,
);

return {
csvContent,
dataTableName,
};
} catch (e: unknown) {
if (e instanceof DataTableNotFoundError) {
throw new NotFoundError(e.message);
} else if (e instanceof Error) {
throw new InternalServerError(e.message, e);
} else {
throw e;
}
}
}

/**
* @returns the IDs of the inserted rows
*/
111 changes: 111 additions & 0 deletions packages/cli/src/modules/data-table/data-table.service.ts
@@ -688,4 +688,115 @@ export class DataTableService {
dataTables: accessibleDataTables,
};
}

async generateDataTableCsv(
dataTableId: string,
projectId: string,
): Promise<{ csvContent: string; dataTableName: string }> {
const dataTable = await this.validateDataTableExists(dataTableId, projectId);

// Fetch columns (ordered by index)
Contributor commented:

nit: I guess we do not need all these comments?

Contributor commented:

Same with the comments below.

Contributor (author) replied:

Claude was very generous with the comments. Cleaned them out.

const columns = await this.dataTableColumnRepository.getColumns(dataTableId);

// Fetch ALL rows (no pagination for export)
const { data: rows } = await this.dataTableRowsRepository.getManyAndCount(
dataTableId,
{
skip: 0,
take: -1, // Get all rows
},
columns,
);

// Build CSV
const csvContent = this.buildCsvContent(rows, columns);

return {
csvContent,
dataTableName: dataTable.name,
};
}

private buildCsvContent(rows: DataTableRowReturn[], columns: DataTableColumn[]): string {
// Sort columns once to avoid repeated sorting in the row loop
const sortedColumns = [...columns].sort((a, b) => a.index - b.index);

// Build header row: id + user columns + createdAt/updatedAt at the end
const userHeaders = sortedColumns.map((col) => col.name);
const headers = ['id', ...userHeaders, 'createdAt', 'updatedAt'];

// Escape and join headers
const csvRows: string[] = [headers.map((h) => this.escapeCsvValue(h)).join(',')];

// Build data rows
for (const row of rows) {
const values: string[] = [];

// Add id first
values.push(this.escapeCsvValue(row.id));

// Add user column values (in correct order)
for (const column of sortedColumns) {
const value = row[column.name];
values.push(this.escapeCsvValue(this.formatValueForCsv(value, column.type)));
}

// Add createdAt and updatedAt at the end
values.push(this.escapeCsvValue(this.formatDateForCsv(row.createdAt)));
values.push(this.escapeCsvValue(this.formatDateForCsv(row.updatedAt)));

csvRows.push(values.join(','));
}

return csvRows.join('\n');
}

private formatValueForCsv(value: unknown, columnType: DataTableColumnType): string {
// Handle NULL/undefined
if (value === null || value === undefined) {
return '';
}

// Handle dates - always use ISO format for CSV
if (columnType === 'date') {
if (value instanceof Date || typeof value === 'string') {
return this.formatDateForCsv(value);
}
}

// Handle booleans - already normalized to true/false by normalizeRows
if (columnType === 'boolean') {
return String(value);
}

// Handle numbers
if (columnType === 'number') {
return String(value);
}

// Handle strings and everything else
return String(value);
}

private formatDateForCsv(date: Date | string): string {
if (date instanceof Date) {
return date.toISOString();
}
// If it's already a string, try to parse and format
const parsed = new Date(date);
return !isNaN(parsed.getTime()) ? parsed.toISOString() : String(date);
}

private escapeCsvValue(value: unknown): string {
const str = String(value);

// RFC 4180 compliant escaping:
// - If value contains comma, quote, or newline, wrap in quotes
// - Escape quotes by doubling them
if (str.includes(',') || str.includes('"') || str.includes('\n') || str.includes('\r')) {
return `"${str.replace(/"/g, '""')}"`;
}

return str;
}
Comment on lines 775 to 795
Contributor commented:

Looks like leading/trailing whitespace won't be preserved. Was this intentional? I guess we should keep the original data as it comes.

Contributor (author) replied:

Good point. Made the change so that if there is leading or trailing whitespace, the string is wrapped in quotes. This helps with importing into Google Sheets.
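
For reference, a minimal sketch of what the adjusted escaping could look like (the exact change is not part of this diff):

private escapeCsvValue(value: unknown): string {
	const str = String(value);

	// Quote when the value contains a comma, quote, or newline (RFC 4180),
	// and also when it has leading/trailing whitespace so spreadsheet imports preserve it.
	const needsQuoting =
		str.includes(',') ||
		str.includes('"') ||
		str.includes('\n') ||
		str.includes('\r') ||
		str !== str.trim();

	return needsQuoting ? `"${str.replace(/"/g, '""')}"` : str;
}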

}
2 changes: 2 additions & 0 deletions packages/frontend/@n8n/i18n/src/locales/en.json
@@ -3288,6 +3288,8 @@
"dataTable.import.invalidColumnName": "Only alphabetical and non-leading numbers and underscores allowed",
"dataTable.delete.confirm.message": "Are you sure you want to delete the data table '{name}'? This action cannot be undone.",
"dataTable.delete.error": "Error deleting data table",
"dataTable.download.csv": "Download CSV",
"dataTable.download.error": "Failed to download data table",
"dataTable.rename.error": "Error renaming data table",
"dataTable.getDetails.error": "Error fetching data table details",
"dataTable.notFound": "Data table not found",
@@ -40,6 +40,11 @@ const telemetry = useTelemetry();

const actions = computed<Array<UserAction<IUser>>>(() => {
const availableActions = [
{
Contributor commented:

I'm surprised that we added this here and not next to the search input

label: i18n.baseText('dataTable.download.csv'),
value: DATA_TABLE_CARD_ACTIONS.DOWNLOAD_CSV,
disabled: false,
},
{
label: i18n.baseText('generic.delete'),
value: DATA_TABLE_CARD_ACTIONS.DELETE,
@@ -67,6 +72,10 @@ const onAction = async (action: string) => {
});
break;
}
case DATA_TABLE_CARD_ACTIONS.DOWNLOAD_CSV: {
await downloadDataTableCsv();
break;
}
case DATA_TABLE_CARD_ACTIONS.DELETE: {
const promptResponse = await message.confirm(
i18n.baseText('dataTable.delete.confirm.message', {
@@ -86,6 +95,19 @@
}
};

const downloadDataTableCsv = async () => {
try {
await dataTableStore.downloadDataTableCsv(props.dataTable.id, props.dataTable.projectId);

telemetry.track('User downloaded data table CSV', {
data_table_id: props.dataTable.id,
data_table_project_id: props.dataTable.projectId,
});
} catch (error) {
toast.showError(error, i18n.baseText('dataTable.download.error'));
}
};

const deleteDataTable = async () => {
try {
const deleted = await dataTableStore.deleteDataTable(
@@ -21,6 +21,7 @@ export const DATA_TABLE_CARD_ACTIONS = {
RENAME: 'rename',
DELETE: 'delete',
CLEAR: 'clear',
DOWNLOAD_CSV: 'download-csv',
};

export const ADD_DATA_TABLE_MODAL_KEY = 'addDataTableModal';
@@ -223,6 +223,25 @@ export const fetchDataTableGlobalLimitInBytes = async (context: IRestApiContext)
);
};

export const downloadDataTableCsvApi = async (
context: IRestApiContext,
dataTableId: string,
projectId: string,
): Promise<{ csvContent: string; filename: string }> => {
const response = await makeRestApiRequest<{ csvContent: string; dataTableName: string }>(
context,
'GET',
`/projects/${projectId}/data-tables/${dataTableId}/download-csv`,
);

// Use just the data table name as filename
const filename = `${response.dataTableName}.csv`;

return {
csvContent: response.csvContent,
filename,
};
};
export const uploadCsvFileApi = async (
context: IRestApiContext,
file: File,
@@ -15,6 +15,7 @@ import {
updateDataTableRowsApi,
deleteDataTableRowsApi,
fetchDataTableGlobalLimitInBytes,
downloadDataTableCsvApi,
uploadCsvFileApi,
} from '@/features/core/dataTable/dataTable.api';
import type {
@@ -273,6 +274,36 @@ export const useDataTableStore = defineStore(DATA_TABLE_STORE, () => {
return result;
};

const downloadDataTableCsv = async (dataTableId: string, projectId: string) => {
@cubic-dev-ai (bot) commented on Nov 19, 2025:

Rule violated: Tests

downloadDataTableCsv introduces a new CSV export path (API call plus DOM-triggered download) but no unit/UI tests were added to cover it, violating the Community PR Guidelines §2 testing requirement for new functionality.
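
For illustration, a rough vitest sketch of how this path could be covered, mocking the API module and stubbing the object-URL APIs that jsdom lacks; the store wiring in real tests may need additional setup:

// Hypothetical sketch; adapt to the project's frontend test conventions.
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { createPinia, setActivePinia } from 'pinia';
import { useDataTableStore } from '@/features/core/dataTable/dataTable.store';
import * as dataTableApi from '@/features/core/dataTable/dataTable.api';

vi.mock('@/features/core/dataTable/dataTable.api');

describe('downloadDataTableCsv', () => {
	beforeEach(() => {
		setActivePinia(createPinia());
		// jsdom does not implement object URLs, so stub them for the blob download
		URL.createObjectURL = vi.fn(() => 'blob:mock-url');
		URL.revokeObjectURL = vi.fn();
	});

	it('fetches the CSV and triggers a browser download', async () => {
		vi.mocked(dataTableApi.downloadDataTableCsvApi).mockResolvedValue({
			csvContent: 'id,name\n1,Alice',
			filename: 'users.csv',
		});
		const clickSpy = vi
			.spyOn(HTMLAnchorElement.prototype, 'click')
			.mockImplementation(() => {});

		const store = useDataTableStore();
		await store.downloadDataTableCsv('table-1', 'project-1');

		expect(dataTableApi.downloadDataTableCsvApi).toHaveBeenCalledWith(
			expect.anything(),
			'table-1',
			'project-1',
		);
		expect(clickSpy).toHaveBeenCalled();
		expect(URL.revokeObjectURL).toHaveBeenCalledWith('blob:mock-url');
	});
});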


Contributor commented:

We can make this readable like this:

const UTF8_BOM = '\uFEFF';

const createCsvBlob = (csvContent: string): Blob => {
	// Add BOM for Excel compatibility with special characters
	return new Blob([UTF8_BOM + csvContent], { 
		type: 'text/csv;charset=utf-8;' 
	});
};

const triggerBrowserDownload = (blob: Blob, filename: string): void => {
	const url = URL.createObjectURL(blob);
	const link = document.createElement('a');
	
	link.href = url;
	link.download = filename;
	link.style.display = 'none';
	
	document.body.appendChild(link);
	
	try {
		link.click();
	} finally {
		document.body.removeChild(link);
		URL.revokeObjectURL(url);
	}
};

const downloadDataTableCsv = async (dataTableId: string, projectId: string) => {
	const { csvContent, filename } = await downloadDataTableCsvApi(
		rootStore.restApiContext,
		dataTableId,
		projectId,
	);
	
	const csvBlob = createCsvBlob(csvContent);
	triggerBrowserDownload(csvBlob, filename);
};

// Fetch CSV content with authentication
const { csvContent, filename } = await downloadDataTableCsvApi(
rootStore.restApiContext,
dataTableId,
projectId,
);

// Create blob with UTF-8 BOM for Excel compatibility
const bom = '\uFEFF';
const blob = new Blob([bom + csvContent], { type: 'text/csv;charset=utf-8;' });
const url = URL.createObjectURL(blob);

const tempElement = document.createElement('a');
tempElement.setAttribute('href', url);
tempElement.setAttribute('download', filename);
tempElement.style.display = 'none';
document.body.appendChild(tempElement);

try {
tempElement.click();
} finally {
// Ensure cleanup happens even if click fails
if (document.body.contains(tempElement)) {
document.body.removeChild(tempElement);
}
URL.revokeObjectURL(url);
}
};

return {
dataTables,
totalCount,
@@ -295,6 +326,7 @@
insertEmptyRow,
updateRow,
deleteRows,
downloadDataTableCsv,
projectPermissions,
};
});