Update some docs to remove old backend-specific linkers
ADBond committed Jul 23, 2024
1 parent 0545ce7 commit e66600e
Showing 5 changed files with 15 additions and 12 deletions.
6 changes: 4 additions & 2 deletions docs/api_docs/datasets.md
@@ -17,16 +17,18 @@ df = splink_datasets.fake_1000
which you can then use to set up a linker:
```py
from splink import splink_datasets, Linker, DuckDBAPI, SettingsCreator

df = splink_datasets.fake_1000
-linker = DuckDBLinker(
+linker = Linker(
    df,
    SettingsCreator(
        link_type="dedupe_only",
        comparisons=[
            cl.exact_match("first_name"),
            cl.exact_match("surname"),
        ],
-    )
+    ),
+    db_api=DuckDBAPI()
)
```
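The snippet above references a `cl` alias whose import sits outside the lines shown in this hunk; as a hedged aside, it presumably refers to Splink's comparison library:

```py
# Hedged assumption: the import backing the `cl` alias used above
import splink.comparison_library as cl
```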

4 changes: 2 additions & 2 deletions docs/dev_guides/debug_modes.md
@@ -44,7 +44,7 @@ Note that by default Splink sets the [logging level to `INFO` on initialisation]

```python
import logging
-linker = DuckDBLinker(df, settings, set_up_basic_logging=False)
+linker = Linker(df, settings, db_api, set_up_basic_logging=False)

# This must come AFTER the linker is initialised, because the logging level
# will be set to INFO
@@ -61,5 +61,5 @@ logging.basicConfig(format="%(message)s")
splink_logger = logging.getLogger("splink")
splink_logger.setLevel(logging.INFO)

-linker = DuckDBLinker(df, settings, set_up_basic_logging=False)
+linker = Linker(df, settings, db_api, set_up_basic_logging=False)
```
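As a hedged aside building on the snippet above (the DEBUG level choice is an illustrative assumption, not part of this commit), the same `splink` logger can be turned up for more detail once basic logging has been configured manually:

```python
import logging

logging.basicConfig(format="%(message)s")

# "splink" is the logger name used above; DEBUG is the standard-library level
# below INFO, chosen here purely as an example of increasing verbosity.
splink_logger = logging.getLogger("splink")
splink_logger.setLevel(logging.DEBUG)
```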
2 changes: 1 addition & 1 deletion docs/topic_guides/data_preparation/feature_engineering.md
@@ -194,7 +194,7 @@ For a more detailed explanation on phonetic transformation algorithms, see the [

### Example

-There are a number of python packages which support phonetic transformations that can be applied to a pandas dataframe, which can then be loaded into the `DuckDBLinker`. For example, creating a [Double Metaphone](../comparisons/phonetic.md#double-metaphone) column with the [phonetics](https://pypi.org/project/phonetics/) python library:
+There are a number of python packages which support phonetic transformations that can be applied to a pandas dataframe, which can then be loaded into the `Linker`. For example, creating a [Double Metaphone](../comparisons/phonetic.md#double-metaphone) column with the [phonetics](https://pypi.org/project/phonetics/) python library:

```python
import pandas as pd
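# The rest of this example is collapsed in the diff view. As a hedged,
# illustrative sketch (an assumption, not the exact code from the docs),
# a Double Metaphone column could be derived with the phonetics package:
import phonetics

names = pd.DataFrame({"first_name": ["stephen", "steven", "sophie"]})

# phonetics.dmetaphone() returns a (primary, secondary) code pair per value
names["first_name_dm"] = names["first_name"].apply(phonetics.dmetaphone)
print(names)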
12 changes: 6 additions & 6 deletions docs/topic_guides/performance/optimising_duckdb.md
@@ -99,17 +99,17 @@ Use the special `:temporary:` connection built into Splink that creates a tempor

```python

-linker = DuckDBLinker(
-    df, settings, connection=":temporary:"
+linker = Linker(
+    df, settings, DuckDBAPI(connection=":temporary:")
)
```

Use an on-disk database:

```python
con = duckdb.connect(database='my-db.duckdb')
-linker = DuckDBLinker(
-    df, settings, connection=con
+linker = Linker(
+    df, settings, DuckDBAPI(connection=con)
)
```

@@ -119,8 +119,8 @@ Use an in-memory database, but ensure it can spill to disk:
con = duckdb.connect(":memory:")

con.execute("SET temp_directory='/path/to/temp';")
-linker = DuckDBLinker(
-    df, settings, connection=con
+linker = Linker(
+    df, settings, DuckDBAPI(connection=con)
)
```
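As a hedged addition (the `memory_limit` value below is an illustrative assumption, not taken from this page), DuckDB's memory cap can also be lowered explicitly so that large intermediate results spill to the temp directory rather than exhausting RAM:

```python
import duckdb

con = duckdb.connect(":memory:")

# Both settings are standard DuckDB options; the specific values are examples.
con.execute("SET temp_directory='/path/to/temp';")
con.execute("SET memory_limit='4GB';")

# The connection can then be passed to DuckDBAPI(connection=con) as above.
```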

3 changes: 2 additions & 1 deletion docs/topic_guides/splink_fundamentals/settings.md
@@ -416,7 +416,8 @@ where the `m_probability` and `u_probability` values here are then used to gener
When using a pre-trained model, you can read in the model from a json and recreate the linker object to make new pairwise predictions. For example:

```py
-linker = DuckDBLinker(new_df,
+linker = Linker(
+    new_df,
    settings="./path/to/model.json",
    db_api=db_api
)
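# Hedged follow-on (an assumption, not shown in this hunk): pairwise
# predictions can then be generated from the reloaded model via the linker's
# predict method, e.g. linker.inference.predict() in Splink 4
# (linker.predict() in earlier versions).
df_predictions = linker.inference.predict()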