
The CSV Import pulls CSV files from an SFTP server you control and turns each row into a record on a Data Feed. Reach for it when your source system is a nightly export, a partner-supplied feed, or any process that drops a file on a server but doesn’t expose an API.
Auth: SFTP credentials
Update methods: Trigger · Polling
Resources: Content (one record per row)
The CSV Import follows the standard integration model — see how integrations work for Connection, Channels, and Data Feeds. This page covers the SFTP-specific configuration and the row-to-record translation.

Install and setup

There’s no plugin to install. You provide an SFTP endpoint and Frontic does the rest.

Decide where files will live

Pick a directory on an SFTP server Frontic can reach. Best practice: a dedicated user with read+delete on a single base path, not your whole filesystem.

Add the integration in Frontic

In the Frontic admin, go to Integrations → Add → CSV Import. Enter the SFTP host, port, username, password, and base path.

Test the connection

Frontic opens a connection, lists the base path, and confirms it reached the server. Fix credentials or firewall rules until this test passes.

Add channels and feeds

Configure at least one channel (declares which project locales the rows resolve into) and one Data Feed per filename pattern you want to ingest. See the sections below.

Connection settings

  • Use SFTP (boolean, required): Currently the only supported transport. Reserved for future protocols.
  • Host (string, required): SFTP server hostname or IP.
  • Port (string, default: "22"): SFTP port.
  • Username (string, required): SFTP user. Use a dedicated account scoped to the import directory.
  • Password (string, required): SFTP password. Stored encrypted; redacted in logs.
  • Base path (string, default: "/"): The root directory the connector lists from. Feed-level configurations are evaluated relative to this path.
The settings panel shows a Connection Status indicator. When the SFTP connection is broken, it surfaces the reason returned by the server — so you can tell whether credentials are missing, the host is unreachable, or auth was rejected. Re-test after rotating credentials.

Channels

Unlike connectors with source-side segmentation (Shopware sales channels, Storyblok spaces), CSV files aren’t natively segmented. The CSV Import’s channels exist purely to declare which project translations each CSV row should be resolved into. Each row gets the same payload across every translation on the channel — see “Same row across all locales” under Good to know for the implication. Per channel, the connector stores:
  • Channel Name (string, required): A label for the channel in Frontic.
  • Available Translations (multiselect, required): Project locale keys the CSV’s rows resolve into. Every row gets the same payload across all of these.
  • Fallback Translation (select, required): The translation used as fallback for missing values.
To deliver locale-specific copy via CSV, see the workaround under “Same row across all locales” below.
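To make the fan-out concrete, here is a minimal sketch (variable names are hypothetical, not the connector’s actual code) of how one CSV row resolves into identical payloads for every translation on a channel:

```python
# Hypothetical channel configuration mirroring the fields above.
channel = {
    "name": "Web",
    "available_translations": ["de_DE", "en_US"],
    "fallback_translation": "en_US",
}

row = {"id": "1001", "name": "Cotton Tee", "sku": "TEE-001"}

# Every locale declared on the channel receives the same payload;
# the CSV Import does not vary the row per translation.
resolved = {locale: dict(row) for locale in channel["available_translations"]}

print(resolved["de_DE"] == resolved["en_US"])  # True: identical payloads
```

This is why locale-specific copy needs the workaround described under Good to know.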

Data Feeds

The CSV Import exposes a single resource type — Content (one record per row). Each Data Feed binds a filename pattern under your base path; you can have many feeds per integration if you have many file types (products, customers, categories, …). The standard Settings → Updates → Schema setup wizard applies — see Data Feeds in the overview. For CSV Import specifically:
  • Updates step — Polling and manual trigger are supported. The Ingest API isn’t used.
  • No feed Refresh. Each polling pass reprocesses every matching file. Use Cleanup files so processed files aren’t picked up next time, or version filenames so the regex matches only new ones.

Feed configuration

  • Start dir (string): Subdirectory under the base path where this feed’s files live. Leave empty to use the base path itself.
  • Delimiter (string, default: ";"): Field delimiter. Allowed values are comma (,), semicolon (;), and tab (\t); anything else is rejected.
  • Filename (string, required): Regex pattern matched against filenames in the start dir. For example, ^products_\d{8}\.csv$ matches products_20260426.csv.
  • Use lockfile (boolean): When on, the connector checks for a lock file in the same folder before reading any file. If a lock file exists, the run aborts and no files are processed. Useful for partner uploads that drop a .lock while writing.
  • Lockfile (string; sub-config of Use lockfile): The exact filename to look for as the lock file. Defaults to empty (no check).
  • Sort files and folders (boolean): When on, files are processed in filename order. Useful when filenames carry timestamps and order matters.
  • Folder sort (string, default: "ASC"; sub-config of Sort files and folders): Either ASC or DESC.
  • Use custom identifier (boolean): When on, the record’s Source ID is built from one or more named columns instead of an id column. Multi-column identifiers are joined with -.
  • Custom identifiers (string[]; sub-config of Use custom identifier): Column names to use as the Source ID.
  • Cleanup files (boolean, default: false): Delete each file after a successful run.
  • Cleanup folders (boolean, default: false; sub-config of Cleanup files): After cleaning files, also remove directories that ended up empty.
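The interplay of lockfile, filename regex, and sorting can be sketched as follows (the function name and signature are illustrative, not the connector’s actual implementation):

```python
import re

def select_files(listing, filename_regex, lockfile=None, sort=False, order="ASC"):
    """Sketch of the feed's file-selection rules for one polling pass."""
    # Lockfile check: if the configured lock file is present, the whole
    # run aborts and no files are processed.
    if lockfile and lockfile in listing:
        return []
    # Filename is a regex matched against each name in the start dir.
    pattern = re.compile(filename_regex)
    matches = [name for name in listing if pattern.match(name)]
    # Optional filename ordering (ASC or DESC), useful for dated exports.
    if sort:
        matches.sort(reverse=(order == "DESC"))
    return matches

files = ["products_20260426.csv", "products_20260425.csv", "notes.txt"]
print(select_files(files, r"^products_\d{8}\.csv$", sort=True))
# ['products_20260425.csv', 'products_20260426.csv']
print(select_files(files + ["upload.lock"], r"^products_\d{8}\.csv$",
                   lockfile="upload.lock"))
# []
```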

What the data looks like

One row, one record

The header row of the CSV becomes the field keys; each subsequent row becomes a record. Whitespace in headers is trimmed. The Source ID for a record is taken from the id column unless you’ve set Custom identifiers.
id;name;sku;price_eur;price_usd
1001;Cotton Tee;TEE-001;19.99;21.99
1002;Linen Shirt;SHI-002;49.99;54.99
The two rows above produce two records on the feed: 1001 and 1002. As CSV files come in, the feed’s schema fills in automatically from the column headers — no need to declare fields up front in the wizard’s Schema step.
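The row-to-record translation can be reproduced with Python’s standard csv module (a sketch of the behavior described above, not the connector’s code; the example Custom identifiers columns are made up):

```python
import csv
import io

data = """id;name;sku;price_eur;price_usd
1001;Cotton Tee;TEE-001;19.99;21.99
1002;Linen Shirt;SHI-002;49.99;54.99
"""

# Header row becomes the field keys; each subsequent row becomes a record.
rows = list(csv.DictReader(io.StringIO(data), delimiter=";"))

# Default: the Source ID is taken from the id column.
records = {row["id"]: row for row in rows}
print(sorted(records))  # ['1001', '1002']

# With Use custom identifier on and Custom identifiers = ["sku", "id"]
# (hypothetical choice), the columns are joined with "-".
custom = {"-".join(row[col] for col in ["sku", "id"]): row for row in rows}
print(sorted(custom))  # ['SHI-002-1002', 'TEE-001-1001']
```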

Good to know

  • CSV only. No JSON, no XML, no Excel. If you need other formats, convert upstream or use the Custom Integration and the Ingest API.
  • Same row across all locales. A CSV row produces the same payload for every locale on the channel. The CSV Import doesn’t read column-name suffixes like name_de / name_en natively. To split by locale: ingest the fields raw and resolve translations in your Value Composer, or split your data into per-locale files and feeds and map them via the Data Sync.
  • No real-time. The connector polls. Webhook-style “the moment a file arrives” delivery isn’t supported — set the polling cadence for your acceptable lag.
  • Every run re-reads every matching file. There’s no file-level “I’ve seen this before” check — if a file stays in the directory, it’s parsed again on the next run. Rows that haven’t materially changed still short-circuit at the storage layer (change detection), so downstream impact is contained — but each row counts as an API Update at intake. Use Cleanup files to delete processed files, or version filenames so the regex picks up only the new ones.
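If your own tooling can rewrite the feed’s Filename setting between runs, one way to approximate “only new files” is to generate a date-anchored regex per run. A sketch, assuming the products_YYYYMMDD naming used above (the helper is hypothetical):

```python
import re
from datetime import date

def todays_pattern(today=None):
    # Build a Filename regex that matches only today's export, so files
    # left over from earlier days are ignored by the polling pass.
    today = today or date.today()
    return re.compile(rf"^products_{today:%Y%m%d}\.csv$")

pattern = todays_pattern(date(2026, 4, 26))
print(bool(pattern.match("products_20260426.csv")))  # True
print(bool(pattern.match("products_20260425.csv")))  # False
```

Deleting processed files via Cleanup files is the simpler option when you control the directory.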

Custom Integration

For source systems with their own API — push records directly via the Ingest API.

Ingest API

The HTTP push path when SFTP polling isn’t the right fit.