Title: Data Base Backend for 'mlr3'
Description: Extends the 'mlr3' package with a backend to transparently work with databases such as 'SQLite', 'DuckDB', 'MySQL', 'MariaDB', or 'PostgreSQL'. The package provides two additional backends: 'DataBackendDplyr' relies on the abstraction of package 'dbplyr' to interact with most DBMS. 'DataBackendDuckDB' operates on 'DuckDB' databases and also on Apache Parquet files.
Authors: Michel Lang [cre, aut]
Maintainer: Michel Lang <[email protected]>
License: LGPL-3
Version: 0.5.1
Built: 2024-10-05 04:56:43 UTC
Source: https://github.com/mlr-org/mlr3db
mlr3db.sqlite_dir
: Default directory to store SQLite databases constructed with as_sqlite_backend().

mlr3db.duckdb_dir
: Default directory to store DuckDB databases constructed with as_duckdb_backend().
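Both options fall back to ":temp:", i.e. a temporary directory. As a minimal sketch, to keep the converted database files in a persistent directory instead, the option can be set before conversion (the directory name below is purely illustrative):

# store converted DuckDB database files in a dedicated directory (example path)
options(mlr3db.duckdb_dir = "~/mlr3db-cache")

# subsequent conversions pick up the new default, e.g.:
# backend = as_duckdb_backend(my_data)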
Maintainer: Michel Lang <[email protected]>
Useful links:
Report bugs at https://github.com/mlr-org/mlr3db/issues
Converts to a DataBackendDuckDB using a DuckDB database, depending on the input type:

data.frame
: Creates a new DataBackendDataTable first using as_data_backend(), then proceeds with the conversion from DataBackendDataTable to DataBackendDuckDB.

mlr3::DataBackend
: Creates a new DuckDB database in the specified path. The filename is determined by the hash of the DataBackend. If the file already exists, a connection to the existing database is established and the existing files are reused.
The created backend automatically reconnects to the database if the connection was lost, e.g. because the object was serialized to the filesystem and restored in a different R session. The only requirement is that the path does not change and that the path is accessible on all workers.
as_duckdb_backend(data, path = getOption("mlr3db.duckdb_dir", ":temp:"), ...)
data
: (data.frame() | mlr3::DataBackend)
Object to convert.

path
: (character(1))
Directory where the DuckDB database file is created.
The default for this argument can be configured via option mlr3db.duckdb_dir.

...
: (any)
Additional arguments.
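A minimal usage sketch, assuming the duckdb package is installed (the data set and task id are purely illustrative):

if (mlr3misc::require_namespaces(c("duckdb", "DBI"), quietly = TRUE)) {
  # convert an in-memory data.frame; the DuckDB file is created in the default path
  backend = as_duckdb_backend(iris)
  print(backend)

  # the converted backend works like any other mlr3 DataBackend
  task = mlr3::TaskClassif$new(id = "iris_duckdb", backend = backend, target = "Species")
  print(task)
}

Because the database file lives on disk, the resulting backend can be serialized and restored (e.g. on remote workers) and reconnects automatically as long as the path stays accessible.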
Converts to a DataBackendDplyr using an RSQLite database, depending on the input type:

data.frame
: Creates a new DataBackendDataTable first using as_data_backend(), then proceeds with the conversion from DataBackendDataTable to DataBackendDplyr.

mlr3::DataBackend
: Creates a new SQLite database in the specified path. The filename is determined by the hash of the DataBackend. If the file already exists, a connection to the existing database is established and the existing files are reused.
The created backend automatically reconnects to the database if the connection was lost, e.g. because the object was serialized to the filesystem and restored in a different R session. The only requirement is that the path does not change and that the path is accessible on all workers.
as_sqlite_backend(data, path = getOption("mlr3db.sqlite_dir", ":temp:"), ...)
data
: (data.frame() | mlr3::DataBackend)
Object to convert.

path
: (character(1))
Directory where the SQLite database file is created.
The default for this argument can be configured via option mlr3db.sqlite_dir.

...
: (any)
Additional arguments.

Returns a DataBackendDplyr or mlr3::Task.
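A minimal usage sketch, assuming the RSQLite and dbplyr packages are installed:

if (mlr3misc::require_namespaces(c("RSQLite", "dbplyr"), quietly = TRUE)) {
  # convert an in-memory data.frame; the SQLite file is created in the default path
  backend = as_sqlite_backend(iris)
  print(backend)
  backend$head()
}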
A mlr3::DataBackend using dplyr::tbl()
from packages dplyr/dbplyr.
This includes tibbles
and abstract database connections interfaced by dbplyr.
The latter allows mlr3::Tasks to interface an out-of-memory database.
mlr3::DataBackend
-> DataBackendDplyr
levels
: (named list())
List (named with column names) of factor levels as character().
Used to auto-convert character columns to factor variables.

connector
: (function())
Function which is called to re-connect in case the connection became invalid.

rownames
: (integer())
Returns vector of all distinct row identifiers, i.e. the contents of the primary key column.

colnames
: (character())
Returns vector of all column names, including the primary key column.

nrow
: (integer(1))
Number of rows (observations).

ncol
: (integer(1))
Number of columns (variables), including the primary key column.

valid
: (logical(1))
Returns NA if the data does not inherit from "tbl_sql" (i.e., it is not a real SQL database).
Returns the result of DBI::dbIsValid() otherwise.
new()
Creates a backend for a dplyr::tbl() object.
DataBackendDplyr$new(data, primary_key, strings_as_factors = TRUE, connector = NULL)
data
: (dplyr::tbl())
The data object.
Instead of calling the constructor yourself, you can call mlr3::as_data_backend() on a dplyr::tbl().
Note that only objects of class "tbl_lazy" will be converted to a DataBackendDplyr (this includes all connectors from dbplyr).
Local "tbl" objects such as tibbles will be converted to a DataBackendDataTable.

primary_key
: (character(1))
Name of the primary key column.
strings_as_factors
: (logical(1) | character())
Either a character vector of column names to convert to factors, or a single logical flag:
if FALSE, no column will be converted; if TRUE, all string columns (except the primary key) are converted.
For conversion, the backend is queried for distinct values of the respective columns on construction and their levels are stored in $levels.
connector
: (function())
If not NULL, a function which re-connects to the database in case the connection has become invalid.
Database connections can become invalid due to timeouts or if the backend is serialized to the file system and then de-serialized again.
This round trip is often performed for parallelization, e.g. to send the objects to remote workers.
DBI::dbIsValid() is called to validate the connection.
The function must return just the connection, not a dplyr::tbl() object!
Note that this function is serialized together with the backend, including possibly sensitive information such as login credentials.
These can be retrieved from the stored mlr3::DataBackend/mlr3::Task.
To protect your credentials, it is recommended to use the secret package.
A sketch of such a connector is shown after this argument list.
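A minimal sketch of a connector, assuming an SQLite database at a hypothetical path "path/to/db.sqlite"; any DBI-compatible driver works the same way:

# the connector must return the DBI connection itself, not a dplyr::tbl()
connector = function() {
  DBI::dbConnect(RSQLite::SQLite(), dbname = "path/to/db.sqlite")
}

# passed to the constructor, it lets the backend re-establish lost connections:
# b = DataBackendDplyr$new(tbl, primary_key = "row_id", connector = connector)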
finalize()
Finalizer which disconnects from the database. This is called during garbage collection of the instance.
DataBackendDplyr$finalize()
logical(1), the return value of DBI::dbDisconnect().
data()
Returns a slice of the data.
Calls dplyr::filter() and dplyr::select() on the table and converts the result to a data.table::data.table().
The rows must be addressed as a vector of primary key values, columns must be referred to via column names.
Queries for rows with no matching row id and queries for columns with no matching column name are silently ignored.
Rows are guaranteed to be returned in the same order as rows, columns may be returned in an arbitrary order.
Duplicated row ids result in duplicated rows, duplicated column names lead to an exception.
DataBackendDplyr$data(rows, cols, data_format = "data.table")
rows
: (integer())
Row indices.

cols
: (character())
Column names.

data_format
: (character(1))
Desired data format, e.g. "data.table" or "Matrix".
head()
Retrieve the first n rows.
DataBackendDplyr$head(n = 6L)
n
: (integer(1))
Number of rows.

data.table::data.table() of the first n rows.
distinct()
Returns a named list of vectors of distinct values for each column specified.
If na_rm is TRUE, missing values are removed from the returned vectors of distinct values.
Non-existing rows and columns are silently ignored.
DataBackendDplyr$distinct(rows, cols, na_rm = TRUE)
rows
: (integer())
Row indices.

cols
: (character())
Column names.

na_rm
: (logical(1))
Whether to remove NAs or not.

Named list() of distinct values.
missings()
Returns the number of missing values per column in the specified slice of data. Non-existing rows and columns are silently ignored.
DataBackendDplyr$missings(rows, cols)
rows
: (integer())
Row indices.

cols
: (character())
Column names.

Total of missing values per column (named numeric()).
if (mlr3misc::require_namespaces(c("tibble", "RSQLite", "dbplyr"), quietly = TRUE)) {
  # Backend using an in-memory tibble
  data = tibble::as_tibble(iris)
  data$Sepal.Length[1:30] = NA
  data$row_id = 1:150
  b = DataBackendDplyr$new(data, primary_key = "row_id")

  # Object supports all accessors of DataBackend
  print(b)
  b$nrow
  b$ncol
  b$colnames
  b$data(rows = 100:101, cols = "Species")
  b$distinct(b$rownames, "Species")

  # Classification task using this backend
  task = mlr3::TaskClassif$new(id = "iris_tibble", backend = b, target = "Species")
  print(task)
  head(task)

  # Create a temporary SQLite database
  con = DBI::dbConnect(RSQLite::SQLite(), ":memory:")
  dplyr::copy_to(con, data)
  tbl = dplyr::tbl(con, "data")

  # Define a backend on a subset of the database: do not use column "Sepal.Width"
  tbl = dplyr::select_at(tbl, setdiff(colnames(tbl), "Sepal.Width"))
  # Use only first 120 rows
  tbl = dplyr::filter(tbl, row_id %in% 1:120)
  b = DataBackendDplyr$new(tbl, primary_key = "row_id")
  print(b)

  # Query distinct values
  b$distinct(b$rownames, "Species")

  # Query number of missing values
  b$missings(b$rownames, b$colnames)

  # Note that SQLite does not support factors, column Species has been converted to character
  lapply(b$head(), class)

  # Cleanup
  rm(tbl)
  DBI::dbDisconnect(con)
}
A mlr3::DataBackend for duckdb.
Can be easily constructed with as_duckdb_backend()
.
mlr3::DataBackend
-> DataBackendDuckDB
levels
: (named list())
List (named with column names) of factor levels as character().
Used to auto-convert character columns to factor variables.

connector
: (function())
Function which is called to re-connect in case the connection became invalid.

table
: (character(1))
Database table or view to operate on.

table_info
: (data.frame())
Data frame as returned by pragma table_info().

rownames
: (integer())
Returns vector of all distinct row identifiers, i.e. the contents of the primary key column.

colnames
: (character())
Returns vector of all column names, including the primary key column.

nrow
: (integer(1))
Number of rows (observations).

ncol
: (integer(1))
Number of columns (variables), including the primary key column.

valid
: (logical(1))
Returns NA if the data does not inherit from "tbl_sql" (i.e., it is not a real SQL database).
Returns the result of DBI::dbIsValid() otherwise.
new()
Creates a backend for a duckdb::duckdb() database.
DataBackendDuckDB$new(data, table, primary_key, strings_as_factors = TRUE, connector = NULL)
data
: (connection)
A connection created with DBI::dbConnect().
If constructed manually (and not via the helper function as_duckdb_backend()), make sure that there exists a unique index for the key column.

table
: (character(1))
Table or view to operate on.

primary_key
: (character(1))
Name of the primary key column.
strings_as_factors
: (logical(1) | character())
Either a character vector of column names to convert to factors, or a single logical flag:
if FALSE, no column will be converted; if TRUE, all string columns (except the primary key) are converted.
For conversion, the backend is queried for distinct values of the respective columns on construction and their levels are stored in $levels.
connector
: (function())
If not NULL, a function which re-connects to the database in case the connection has become invalid.
Database connections can become invalid due to timeouts or if the backend is serialized to the file system and then de-serialized again.
This round trip is often performed for parallelization, e.g. to send the objects to remote workers.
DBI::dbIsValid() is called to validate the connection.
The function must return just the connection, not a dplyr::tbl() object!
Note that this function is serialized together with the backend, including possibly sensitive information such as login credentials.
These can be retrieved from the stored mlr3::DataBackend/mlr3::Task.
To protect your credentials, it is recommended to use the secret package.
A sketch of such a connector is shown after this argument list.
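A minimal sketch of a connector for a DuckDB file at a hypothetical path "path/to/db.duckdb":

# the connector must return the DBI connection itself
connector = function() {
  DBI::dbConnect(duckdb::duckdb(), dbdir = "path/to/db.duckdb", read_only = TRUE)
}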
finalize()
Finalizer which disconnects from the database. This is called during garbage collection of the instance.
DataBackendDuckDB$finalize()
logical(1), the return value of DBI::dbDisconnect().
data()
Returns a slice of the data.
The rows must be addressed as a vector of primary key values, columns must be referred to via column names.
Queries for rows with no matching row id and queries for columns with no matching column name are silently ignored.
Rows are guaranteed to be returned in the same order as rows, columns may be returned in an arbitrary order.
Duplicated row ids result in duplicated rows, duplicated column names lead to an exception.
DataBackendDuckDB$data(rows, cols, data_format = "data.table")
rows
: (integer())
Row indices.

cols
: (character())
Column names.

data_format
: (character(1))
Desired data format, e.g. "data.table" or "Matrix".
head()
Retrieve the first n rows.
DataBackendDuckDB$head(n = 6L)
n
: (integer(1))
Number of rows.

data.table::data.table() of the first n rows.
distinct()
Returns a named list of vectors of distinct values for each column specified.
If na_rm is TRUE, missing values are removed from the returned vectors of distinct values.
Non-existing rows and columns are silently ignored.
DataBackendDuckDB$distinct(rows, cols, na_rm = TRUE)
rows
: (integer())
Row indices.

cols
: (character())
Column names.

na_rm
: (logical(1))
Whether to remove NAs or not.

Named list() of distinct values.
missings()
Returns the number of missing values per column in the specified slice of data. Non-existing rows and columns are silently ignored.
DataBackendDuckDB$missings(rows, cols)
rows
: (integer())
Row indices.

cols
: (character())
Column names.

Total of missing values per column (named numeric()).
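A minimal sketch of manual construction, assuming the duckdb package is installed (as_duckdb_backend() is usually the more convenient route; the temporary file path is illustrative):

if (mlr3misc::require_namespaces(c("duckdb", "DBI"), quietly = TRUE)) {
  # write a data.frame with an explicit key column into a DuckDB file
  path = tempfile(fileext = ".duckdb")
  con = DBI::dbConnect(duckdb::duckdb(), dbdir = path)
  data = iris
  data$row_id = seq_len(nrow(data))
  DBI::dbWriteTable(con, "iris", data)
  # create a unique index on the key column, as recommended for manual construction
  DBI::dbExecute(con, "CREATE UNIQUE INDEX iris_row_id ON iris (row_id)")
  DBI::dbDisconnect(con, shutdown = TRUE)

  # construct the backend on the stored table via a read-only connection
  con = DBI::dbConnect(duckdb::duckdb(), dbdir = path, read_only = TRUE)
  b = DataBackendDuckDB$new(con, table = "iris", primary_key = "row_id")

  # Object supports all accessors of DataBackend
  print(b)
  b$nrow
  b$colnames
  b$head()
  b$distinct(b$rownames, "Species")
  b$missings(b$rownames, b$colnames)

  # Cleanup
  DBI::dbDisconnect(con, shutdown = TRUE)
}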