libpqxx 7.0.5

pqxx::stream_to
Efficiently write data directly to a database table.

#include <stream_to.hxx>
Public Member Functions

stream_to (transaction_base &, std::string_view table_name)
    Create a stream, without specifying columns.
template<typename Columns>
stream_to (transaction_base &, std::string_view table_name, Columns const &columns)
    Create a stream, specifying column names as a container of strings.
template<typename Iter>
stream_to (transaction_base &, std::string_view table_name, Iter columns_begin, Iter columns_end)
    Create a stream, specifying column names as a sequence of strings.
~stream_to () noexcept
operator bool () const noexcept
bool operator! () const noexcept
void complete ()
    Complete the operation, and check for errors.
template<typename Tuple>
stream_to & operator<< (Tuple const &)
    Insert a row of data.
stream_to & operator<< (stream_from &)
    Stream a stream_from straight into a stream_to.
Efficiently write data directly to a database table.
If you wish to insert rows of data into a table, you can compose INSERT statements and execute them. But it's slow and tedious, and you need to worry about quoting and escaping the data.
If you're just inserting a single row, it probably won't matter much. You can use prepared or parameterised statements to take care of the escaping for you. But if you're inserting large numbers of rows you will want something better.
Inserting rows one by one tends to take a lot of time, especially when you are working with a remote database server over the network. Every single row involves sending the data over the network and waiting for a reply. Do it "in bulk" using stream_to, and you may find that it goes many times faster, sometimes even by orders of magnitude.
Here's how it works: you create a stream_to stream to start writing to your table. You will probably want to specify the columns. Then you feed your data into the stream one row at a time. Finally, you call the stream's complete() to finalise the operation, wait for completion, and check for errors.
You insert data using the << ("shift-left") operator. Each row must be something that can be iterated to get its constituent fields: a std::tuple, a std::vector, or anything else with a begin and end. It could be a class of your own. Of course, the fields have to match the columns you specified when creating the stream.
There is also a matching stream_from for reading data in bulk.
pqxx::stream_to::stream_to (transaction_base & tb, std::string_view table_name)
Create a stream, without specifying columns.
Fields will be inserted in whatever order the columns have in the database.
You'll probably want to specify the columns, so that the mapping between your data fields and the table is explicit in your code, and not hidden in an "implicit contract" between your code and your schema.
template<typename Columns>
pqxx::stream_to::stream_to (transaction_base & tb, std::string_view table_name, Columns const & columns)
Create a stream, specifying column names as a container of strings.
template<typename Iter>
pqxx::stream_to::stream_to (transaction_base & tb, std::string_view table_name, Iter columns_begin, Iter columns_end)
Create a stream, specifying column names as a sequence of strings.
References pqxx::separated_list().
pqxx::stream_to::~stream_to () noexcept
void pqxx::stream_to::complete ()
Complete the operation, and check for errors.
Always call this to close the stream in an orderly fashion, even after an error. (In the case of an error, abort the transaction afterwards.)
The only circumstance where it's safe to skip this is after an error, if you're discarding the entire connection.
References pqxx::transaction_base::conn(), pqxx::internal::transactionfocus::m_trans, and pqxx::internal::transactionfocus::unregister_me().
Referenced by ~stream_to().
pqxx::stream_to::operator bool () const noexcept
bool pqxx::stream_to::operator! () const noexcept
References pqxx::operator<<().
template<typename Tuple>
stream_to & pqxx::stream_to::operator<< (Tuple const & t)
Insert a row of data.
The data can be any type that can be iterated. Each iterated item becomes a field in the row, in the same order as the columns you specified when creating the stream. Each field will be converted into the database's format using pqxx::to_string.
References pqxx::separated_list().
pqxx::stream_to & pqxx::stream_to::operator<< (stream_from & tr)

Stream a stream_from straight into a stream_to.
This can be useful when copying between different databases. If the source and the destination are on the same database, you'll get better performance doing it all in a regular query.
References pqxx::stream_from::get_raw_line(), and pqxx::internal::transactionfocus::register_me().