How to Bulk Insert CSV Data into PostgreSQL Using Go, GORM, and the pgx Library Without Loops?
Suppose you have a CSV file whose rows you want to bulk insert into a PostgreSQL table from a Go application that otherwise uses the GORM ORM, without writing a for loop of INSERTs or raw SQL per row.
The pgx library can handle this by streaming the file straight through PostgreSQL's COPY protocol, as the following snippet demonstrates:
<code class="go">package main

import (
	"context"
	"fmt"
	"os"

	"github.com/jackc/pgx/v4/pgxpool"
)

func main() {
	filename := "foo.csv"

	dbpool, err := pgxpool.Connect(context.Background(), os.Getenv("DATABASE_URL"))
	if err != nil {
		panic(err)
	}
	defer dbpool.Close()

	f, err := os.Open(filename)
	if err != nil {
		panic(err)
	}
	defer func() { _ = f.Close() }()

	// Acquire a single connection from the pool; its low-level PgConn
	// exposes CopyFrom, which streams the reader to the server via COPY.
	conn, err := dbpool.Acquire(context.Background())
	if err != nil {
		panic(err)
	}
	defer conn.Release()

	res, err := conn.Conn().PgConn().CopyFrom(context.Background(), f,
		"COPY csv_test FROM STDIN (FORMAT csv)")
	if err != nil {
		panic(err)
	}
	fmt.Println(res.RowsAffected())
}</code>
In this code:

- pgxpool.Connect opens a connection pool using the DATABASE_URL environment variable.
- The CSV file is opened as a plain io.Reader; no rows are parsed or looped over in Go, and GORM's query builder is bypassed entirely.
- PgConn().CopyFrom hands the file directly to PostgreSQL's COPY protocol, which is far faster than issuing row-by-row INSERTs.
- RowsAffected() reports how many rows were copied into csv_test, which must already exist with columns matching the CSV.