Merged
Changes from all commits (57 commits)
fc41ab9
[MLOP-634] Butterfree dev workflow, set triggers for branches staging…
moromimay Feb 8, 2021
4be4ffe
[BUG] Fix Staging GithubActions Pipeline (#283)
moromimay Feb 8, 2021
a3a601b
Apply only wheel. (#285)
moromimay Feb 8, 2021
4339608
[BUG] Change version on setup.py to PyPI (#286)
moromimay Feb 9, 2021
a82433c
Keep milliseconds when using 'from_ms' argument in timestamp feature …
hmeretti Feb 9, 2021
dcbf540
Change trigger for pipeline staging (#287)
moromimay Feb 10, 2021
a0a9335
Create a dev package. (#288)
moromimay Feb 10, 2021
7427898
[MLOP-633] Butterfree dev workflow, update documentation (#281)
moromimay Feb 10, 2021
245eaa5
[MLOP-632] Butterfree dev workflow, automate release description (#279)
AlvaroMarquesAndrade Feb 11, 2021
d6ecfa4
[MLOP-636] Create migration classes (#282)
AlvaroMarquesAndrade Feb 18, 2021
32e24d6
[MLOP-635] Rebase Incremental Job/Interval Run branch for test on sel…
moromimay Feb 19, 2021
8da89ed
Allow slide selection (#293)
roelschr Feb 22, 2021
0df07ae
Fix Slide Duration Typo (#295)
AlvaroMarquesAndrade Feb 26, 2021
aeb7999
[MLOP-637] Implement diff method (#292)
moromimay Mar 8, 2021
9afc39c
[MLOP-640] Create CLI with migrate command (#298)
roelschr Mar 15, 2021
bf204f2
[MLOP-645] Implement query method, cassandra (#291)
AlvaroMarquesAndrade Mar 15, 2021
b518dbc
[MLOP-671] Implement get_schema on Spark client (#301)
AlvaroMarquesAndrade Mar 16, 2021
5fe4c40
[MLOP-648] Implement query method, metastore (#294)
AlvaroMarquesAndrade Mar 16, 2021
e8fc0da
Fix Validation Step (#302)
AlvaroMarquesAndrade Mar 22, 2021
3d93a09
[MLOP-647] [MLOP-646] Apply migrations (#300)
roelschr Mar 23, 2021
0d30932
[BUG] Apply create_partitions to historical validate (#303)
moromimay Mar 30, 2021
d607297
[BUG] Fix key path for validate read (#304)
moromimay Mar 30, 2021
3dcd975
[FIX] Add Partition types for Metastore (#305)
AlvaroMarquesAndrade Apr 1, 2021
8077d86
[MLOP-639] Track logs in S3 (#306)
moromimay Apr 1, 2021
6d2a8f9
[BUG] Change logging config (#307)
moromimay Apr 6, 2021
d2c5d39
Change solution for tracking logs (#308)
moromimay Apr 8, 2021
43392f4
Read and write consistency level options (#309)
github-felipe-caputo Apr 13, 2021
0f31164
Fix kafka reader. (#310)
moromimay Apr 14, 2021
e6f67e9
Fix path validate. (#311)
moromimay Apr 14, 2021
baa594b
Add local dc property (#312)
github-felipe-caputo Apr 16, 2021
a74f098
Remove metastore migrate (#313)
moromimay Apr 20, 2021
378f3a5
Fix link in our docs. (#315)
moromimay Apr 20, 2021
3b18b5a
[BUG] Fix Cassandra Connect Session (#316)
moromimay Apr 23, 2021
c46f171
Fix migration query. (#318)
moromimay Apr 26, 2021
bb124f5
Fix migration query add type key. (#319)
moromimay Apr 28, 2021
1c97316
Fix db-config condition (#321)
moromimay May 5, 2021
bb7ed77
MLOP-642 Document migration in Butterfree (#320)
roelschr May 7, 2021
5a0a622
[MLOP-702] Debug mode for Automate Migration (#322)
moromimay May 10, 2021
b1371f1
[MLOP-727] Improve logging messages (#325)
GaBrandao Jun 2, 2021
acf7022
[MLOP-728] Improve logging messages (#324)
moromimay Jun 2, 2021
d0bf61a
Fix method to generate agg feature name. (#326)
moromimay Jun 4, 2021
1cf0dbd
[MLOP-691] Include step to add partition to SparkMetastore during wr…
moromimay Jun 10, 2021
9f42f53
Add the missing link for H3 geohash (#330)
jdvala Jun 16, 2021
78927e3
Update README.md (#331)
Jul 30, 2021
43bb3a3
Update Github Actions Workflow runner (#332)
Aug 22, 2022
2593839
Delete sphinx version. (#334)
moromimay Dec 20, 2022
35bcd30
Update files to staging (#336)
moromimay Dec 21, 2022
3a73ed8
Revert "Update files to staging (#336)" (#337)
moromimay Jan 2, 2023
6b78a50
Less strict requirements (#333)
lecardozo Aug 16, 2023
2a19009
feat: optional row count validation (#340)
ralphrass Aug 18, 2023
ca1a16d
fix: parameter, libs (#341)
ralphrass Aug 18, 2023
60c7ee4
pre-release 1.2.2.dev0 (#342)
ralphrass Aug 21, 2023
f35d665
Rebase staging (#343)
ralphrass Aug 21, 2023
2fab2a5
Merge branch 'staging' into rebase-staging-from-master
ralphrass Aug 21, 2023
46ce363
fix: methods
ralphrass Aug 21, 2023
a042a8f
fix: duplicate
ralphrass Aug 21, 2023
7d614fe
fix: version
ralphrass Aug 21, 2023
7 changes: 7 additions & 0 deletions CHANGELOG.md
@@ -5,8 +5,14 @@ Preferably use **Added**, **Changed**, **Removed** and **Fixed** topics in each

 ## [Unreleased]

+## [1.2.2](https://github.com/quintoandar/butterfree/releases/tag/1.2.2)
+
+### Changed
+* Optional row count validation ([#284](https://github.com/quintoandar/butterfree/pull/284))
+* Bump several libs versions ([#333](https://github.com/quintoandar/butterfree/pull/333))
+
 ## [1.2.1](https://github.com/quintoandar/butterfree/releases/tag/1.2.1)

 ### Changed
 * Update README.md ([#331](https://github.com/quintoandar/butterfree/pull/331))
 * Update Github Actions Workflow runner ([#332](https://github.com/quintoandar/butterfree/pull/332))
@@ -16,6 +22,7 @@ Preferably use **Added**, **Changed**, **Removed** and **Fixed** topics in each
 * Add the missing link for H3 geohash ([#330](https://github.com/quintoandar/butterfree/pull/330))

 ## [1.2.0](https://github.com/quintoandar/butterfree/releases/tag/1.2.0)
+
 ### Added
 * [MLOP-636] Create migration classes ([#282](https://github.com/quintoandar/butterfree/pull/282))
 * [MLOP-635] Rebase Incremental Job/Interval Run branch for test on selected feature sets ([#278](https://github.com/quintoandar/butterfree/pull/278))
6 changes: 3 additions & 3 deletions Makefile
@@ -9,8 +9,8 @@ VERSION := $(shell grep __version__ setup.py | head -1 | cut -d \" -f2 | cut -d
 .PHONY: environment
 ## create virtual environment for butterfree
 environment:
-	@pyenv install -s 3.7.6
-	@pyenv virtualenv 3.7.6 butterfree
+	@pyenv install -s 3.7.13
+	@pyenv virtualenv 3.7.13 butterfree
 	@pyenv local butterfree
 	@PYTHONPATH=. python -m pip install --upgrade pip

@@ -221,4 +221,4 @@ help:
 	} \
 	printf "\n"; \
 	}' \
-	| more $(shell test $(shell uname) = Darwin && echo '--no-init --raw-control-chars')
\ No newline at end of file
+	| more $(shell test $(shell uname) = Darwin && echo '--no-init --raw-control-chars')
2 changes: 1 addition & 1 deletion butterfree/configs/db/cassandra_config.py
@@ -246,7 +246,7 @@ def translate(self, schema: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
             cassandra_schema.append(
                 {
                     "column_name": features["column_name"],
-                    "type": cassandra_mapping[str(features["type"])],
+                    "type": cassandra_mapping[str(features["type"]).replace("()", "")],
                     "primary_key": features["primary_key"],
                 }
             )
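The one-line `translate` fix above normalizes Spark type names before the Cassandra lookup: depending on the Spark version, `str(dtype)` may render as `IntegerType()` rather than `IntegerType`, which would miss the mapping keys. A minimal, self-contained sketch of the idea — the mapping and schema below are illustrative stand-ins, not Butterfree's full translation table:

```python
# Illustrative subset of a Spark-to-Cassandra type mapping. Stripping a
# trailing "()" keeps lookups working whether str(dtype) renders as
# "IntegerType" or "IntegerType()".
cassandra_mapping = {
    "IntegerType": "int",
    "LongType": "bigint",
    "StringType": "text",
}

def translate(schema):
    """Translate a Spark-style schema into Cassandra column specs."""
    cassandra_schema = []
    for features in schema:
        cassandra_schema.append(
            {
                "column_name": features["column_name"],
                "type": cassandra_mapping[str(features["type"]).replace("()", "")],
                "primary_key": features["primary_key"],
            }
        )
    return cassandra_schema

schema = [
    # one name with the parenthesized rendering, one without
    {"column_name": "id", "type": "IntegerType()", "primary_key": True},
    {"column_name": "name", "type": "StringType", "primary_key": False},
]
print(translate(schema))
```

Without the `.replace("()", "")`, the first entry would raise a `KeyError` on the `"IntegerType()"` key.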
17 changes: 9 additions & 8 deletions butterfree/load/sink.py
@@ -69,14 +69,15 @@ def validate(
         """
         failures = []
         for writer in self.writers:
-            try:
-                writer.validate(
-                    feature_set=feature_set,
-                    dataframe=dataframe,
-                    spark_client=spark_client,
-                )
-            except AssertionError as e:
-                failures.append(e)
+            if writer.row_count_validation:
+                try:
+                    writer.validate(
+                        feature_set=feature_set,
+                        dataframe=dataframe,
+                        spark_client=spark_client,
+                    )
+                except AssertionError as e:
+                    failures.append(e)

         if failures:
             raise RuntimeError(
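The reworked `Sink.validate` above only runs a writer's validation when its `row_count_validation` flag is set, while still collecting every `AssertionError` before failing once at the end. A simplified, self-contained sketch of that control flow — `FakeWriter` and `sink_validate` are stand-ins for Butterfree's `Writer`/`Sink`, not the real classes:

```python
class FakeWriter:
    """Stand-in writer: `ok` controls whether validation passes."""

    def __init__(self, row_count_validation=True, ok=True):
        self.row_count_validation = row_count_validation
        self.ok = ok

    def validate(self, feature_set=None, dataframe=None, spark_client=None):
        if not self.ok:
            raise AssertionError("row count mismatch")

def sink_validate(writers):
    # Mirror of the new loop: skip writers that opted out, gather the
    # rest's failures, and raise a single RuntimeError at the end.
    failures = []
    for writer in writers:
        if writer.row_count_validation:
            try:
                writer.validate()
            except AssertionError as e:
                failures.append(e)
    if failures:
        raise RuntimeError(f"{len(failures)} writer validation(s) failed")

# A failing writer with validation disabled no longer aborts the load:
sink_validate([FakeWriter(row_count_validation=False, ok=False), FakeWriter()])
```

The same failing writer with the flag left at its default `True` would still surface as a `RuntimeError`.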
@@ -113,9 +113,14 @@ def __init__(
         debug_mode: bool = False,
         interval_mode: bool = False,
         check_schema_hook: Hook = None,
+        row_count_validation: bool = True,
     ):
         super(HistoricalFeatureStoreWriter, self).__init__(
-            db_config or MetastoreConfig(), debug_mode, interval_mode
+            db_config or MetastoreConfig(),
+            debug_mode,
+            interval_mode,
+            False,
+            row_count_validation,
         )
         self.database = database or environment.get_variable(
             "FEATURE_STORE_HISTORICAL_DATABASE"
2 changes: 2 additions & 0 deletions butterfree/load/writers/writer.py
@@ -26,13 +26,15 @@ def __init__(
         debug_mode: bool = False,
         interval_mode: bool = False,
         write_to_entity: bool = False,
+        row_count_validation: bool = True,
     ) -> None:
         super().__init__()
         self.db_config = db_config
         self.transformations: List[Dict[str, Any]] = []
         self.debug_mode = debug_mode
         self.interval_mode = interval_mode
         self.write_to_entity = write_to_entity
+        self.row_count_validation = row_count_validation

     def with_(
         self, transformer: Callable[..., DataFrame], *args: Any, **kwargs: Any
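Taken together with the `HistoricalFeatureStoreWriter` hunk above, the new flag simply threads from the subclass constructor into the base `Writer`, with `write_to_entity` pinned to `False` for the historical writer. A condensed sketch of that wiring — signatures are trimmed to the relevant arguments and `db_config` is a placeholder string, not a real config object:

```python
class Writer:
    """Trimmed stand-in for butterfree.load.writers.writer.Writer."""

    def __init__(self, db_config, debug_mode=False, interval_mode=False,
                 write_to_entity=False, row_count_validation=True):
        self.db_config = db_config
        self.debug_mode = debug_mode
        self.interval_mode = interval_mode
        self.write_to_entity = write_to_entity
        self.row_count_validation = row_count_validation

class HistoricalFeatureStoreWriter(Writer):
    def __init__(self, db_config=None, debug_mode=False, interval_mode=False,
                 row_count_validation=True):
        # Positional args mirror the diff: write_to_entity is always False
        # for the historical writer; row_count_validation passes through.
        super().__init__(db_config or "metastore-config", debug_mode,
                         interval_mode, False, row_count_validation)

writer = HistoricalFeatureStoreWriter(row_count_validation=False)
```

With this in place, `Sink.validate` can consult `writer.row_count_validation` on any writer, which is exactly what the `sink.py` hunk does.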
4 changes: 2 additions & 2 deletions butterfree/reports/metadata.py
@@ -162,7 +162,7 @@ def to_json(self) -> Any:
             "features": [
                 {
                     "column_name": c["column_name"],
-                    "data_type": str(c["type"]),
+                    "data_type": str(c["type"]).replace("()", ""),
                     "description": desc,
                 }
                 for c, desc in params._features
@@ -208,7 +208,7 @@ def to_markdown(self) -> Any:

         features = ["Column name", "Data type", "Description"]
         for c, desc in params._features:
-            features.extend([c["column_name"], str(c["type"]), desc])
+            features.extend([c["column_name"], str(c["type"]).replace("()", ""), desc])

         count_rows = len(features) // 3
10 changes: 5 additions & 5 deletions requirements.dev.txt
@@ -1,11 +1,11 @@
-cmake==3.18.4
-h3==3.7.0
-pyarrow==0.15.1
+h3==3.7.4
 jupyter==1.0.0
 twine==3.1.1
 mypy==0.790
 pyspark-stubs==3.0.0
-sphinx==3.5.4
 sphinxemoji==0.1.8
 sphinx-rtd-theme==0.5.2
-recommonmark==0.7.1
\ No newline at end of file
+recommonmark==0.7.1
+pyarrow>=1.0.0
+setuptools
+wheel
7 changes: 3 additions & 4 deletions requirements.txt
@@ -1,9 +1,8 @@
 cassandra-driver>=3.22.0,<4.0
 mdutils>=1.2.2,<2.0
-pandas>=0.24,<1.1
+pandas>=0.24,<2.0
 parameters-validation>=1.1.5,<2.0
 pyspark==3.*
 typer>=0.3,<0.4
-setuptools>=41,<42
-typing-extensions==3.7.4.3
-boto3==1.17.*
\ No newline at end of file
+typing-extensions>3.7.4,<5
+boto3==1.17.*
1 change: 1 addition & 0 deletions setup.cfg
@@ -24,6 +24,7 @@ spark_options =
     spark.sql.session.timeZone: UTC
     spark.driver.bindAddress: 127.0.0.1
     spark.sql.legacy.timeParserPolicy: LEGACY
+    spark.sql.legacy.createHiveTableByDefault: false

 [mypy]
 # suppress errors about unsatisfied imports
4 changes: 2 additions & 2 deletions setup.py
@@ -1,7 +1,7 @@
 from setuptools import find_packages, setup

 __package_name__ = "butterfree"
-__version__ = "1.2.1"
+__version__ = "1.2.2"
 __repository_url__ = "https://github.com/quintoandar/butterfree"

 with open("requirements.txt") as f:
@@ -34,7 +34,7 @@
     license="Copyright",
     author="QuintoAndar",
     install_requires=requirements,
-    extras_require={"h3": ["cmake==3.16.3", "h3==3.4.2"]},
+    extras_require={"h3": ["h3>=3.7.4,<4"]},
     python_requires=">=3.7, <4",
     entry_points={"console_scripts": ["butterfree=butterfree._cli.main:app"]},
     include_package_data=True,
@@ -77,9 +77,11 @@ def test_feature_set_pipeline(
         self, mocked_df, spark_session, fixed_windows_output_feature_set_dataframe,
     ):
         # arrange
+
         table_reader_id = "a_source"
         table_reader_table = "table"
         table_reader_db = environment.get_variable("FEATURE_STORE_HISTORICAL_DATABASE")
+
         create_temp_view(dataframe=mocked_df, name=table_reader_id)
         create_db_and_table(
             spark=spark_session,
@@ -88,14 +90,16 @@
             table_reader_table=table_reader_table,
         )

-        dbconfig = Mock()
-        dbconfig.mode = "overwrite"
-        dbconfig.format_ = "parquet"
+        path = "test_folder/historical/entity/feature_set"
+
+        dbconfig = MetastoreConfig()
         dbconfig.get_options = Mock(
-            return_value={"path": "test_folder/historical/entity/feature_set"}
+            return_value={"mode": "overwrite", "format_": "parquet", "path": path}
         )

-        historical_writer = HistoricalFeatureStoreWriter(db_config=dbconfig)
+        historical_writer = HistoricalFeatureStoreWriter(
+            db_config=dbconfig, debug_mode=True
+        )

         # act
         test_pipeline = FeatureSetPipeline(
@@ -151,9 +155,13 @@
         )
         test_pipeline.run()

+        # act and assert
+        dbconfig.get_path_with_partitions = Mock(
+            return_value=["historical/entity/feature_set"]
+        )
+
         # assert
-        path = dbconfig.get_options("historical/entity/feature_set").get("path")
-        df = spark_session.read.parquet(path).orderBy(TIMESTAMP_COLUMN)
+        df = spark_session.sql("select * from historical_feature_store__feature_set")

         target_df = fixed_windows_output_feature_set_dataframe.orderBy(
             test_pipeline.feature_set.timestamp_column
@@ -162,9 +170,6 @@
         # assert
         assert_dataframe_equality(df, target_df)

-        # tear down
-        shutil.rmtree("test_folder")
-
     def test_feature_set_pipeline_with_dates(
         self,
         mocked_date_df,
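The integration test now runs the writer with `debug_mode=True` and reads the result back via `spark_session.sql` instead of parquet files under `test_folder/`, which is why the `shutil.rmtree` teardown could be dropped. The test queries a table named `historical_feature_store__feature_set`, suggesting debug mode exposes results under a `historical_feature_store__<feature set name>` temp view rather than writing to storage. The helper below is an assumption inferred from that table name, not Butterfree's actual implementation:

```python
# Hypothetical helper mirroring the temp-view naming the test relies on.
# Assumption: in debug mode the historical writer registers
# "historical_feature_store__<feature set name>" as a Spark temp view
# instead of writing parquet files, so no filesystem cleanup is needed.
def debug_view_name(feature_set_name: str) -> str:
    return f"historical_feature_store__{feature_set_name}"

print(debug_view_name("feature_set"))
```

If that assumption holds, asserting against the view keeps the test hermetic: nothing lands on disk, so nothing has to be torn down.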