Analyzing large amounts of data has become an everyday problem. Oftentimes, the datasets are only available as CSV files, which raises the question: how do you import them into your Postgres database? The short answer: by using Postgres’ `COPY` command. Here’s the long answer:

Let’s imagine you have an `Ecto.Schema` called `Location` with the following definition:

```elixir
schema "locations" do
  field :name, :string
  field :latitude, :float
  field :longitude, :float
end
```
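In case the `locations` table does not exist yet, a migration along these lines would create it (a sketch only; `MyApp` is a placeholder application name, and the column types simply mirror the schema fields):

```elixir
defmodule MyApp.Repo.Migrations.CreateLocations do
  use Ecto.Migration

  def change do
    # Columns match the Location schema: a name plus lat/long coordinates.
    create table(:locations) do
      add :name, :string
      add :latitude, :float
      add :longitude, :float
    end
  end
end
```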
The `locations` table stores location data as latitude and longitude coordinates together with a `name`. For example, this could be a street address with a house number and its geocoded lat+long position.

Now, let’s imagine you have a CSV file called `locations.csv` with 100,000 rows of location data that you want to import into your `locations` table. In the following, we will use Postgres’ `COPY` command for that. First, we will call it directly from a `psql` session, and then we will wrap it in a simple `Mix.Task`. Let’s go!
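To sketch the first step: assuming `locations.csv` starts with a header row and lists the columns in this order (hypothetical contents),

```csv
name,latitude,longitude
"Office Berlin",52.5200,13.4050
"Office Paris",48.8566,2.3522
```

a client-side import from a `psql` session could look like this:

```
\copy locations(name, latitude, longitude) FROM 'locations.csv' WITH (FORMAT csv, HEADER true)
```

`\copy` is `psql`’s client-side wrapper around `COPY`: it reads the file from the machine you run `psql` on, whereas a plain `COPY ... FROM '/path'` reads the file on the database server and requires elevated privileges there.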
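And a minimal sketch of the second step, assuming your application is called `MyApp` with the repo `MyApp.Repo` (placeholder names; the linked post walks through the details). `Ecto.Adapters.SQL.stream/2` returns a stream that is also collectable, so you can feed the raw lines of the CSV file straight into a `COPY ... FROM STDIN` statement inside a transaction:

```elixir
defmodule Mix.Tasks.ImportLocations do
  use Mix.Task

  @shortdoc "Imports a locations CSV into the locations table via COPY"

  @impl Mix.Task
  def run([path]) do
    # Start the application so MyApp.Repo (placeholder name) is running.
    Mix.Task.run("app.start")

    # COPY ... FROM STDIN must run inside a transaction. The SQL stream
    # implements Collectable, so Enum.into/2 sends each line of the file
    # to Postgres as COPY input. HEADER true skips the CSV header row.
    MyApp.Repo.transaction(
      fn ->
        copy =
          Ecto.Adapters.SQL.stream(
            MyApp.Repo,
            "COPY locations(name, latitude, longitude) FROM STDIN WITH (FORMAT csv, HEADER true)"
          )

        Enum.into(File.stream!(path), copy)
      end,
      timeout: :infinity
    )
  end
end
```

You would run it with `mix import_locations locations.csv`. The `timeout: :infinity` option is there so that larger imports don’t hit the default transaction timeout.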
continue reading on peterullrich.com
If this post was enjoyable or useful for you, please share it! If you have comments, questions, or feedback, you can send them to my personal email. To get new posts, subscribe via the RSS feed.