feat: Kickoff Transformation implementation #5130
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Conversation
Very nice!! 🚀🚀🚀
```python
def get_feature_transformation(self) -> Optional[Transformation]:
    if not self.udf:
        return None
    if self.mode == TransformationMode.pandas or self.mode == "pandas":
```
Probably can just do a dictionary mapping
Sure, will add that.
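A sketch of what that dictionary mapping could look like in place of the if/elif chain. The transformation class names here are illustrative stand-ins, not the actual Feast types, and the free-function signature is simplified from the method above:

```python
from enum import Enum
from typing import Optional


class TransformationMode(Enum):
    PYTHON = "python"
    PANDAS = "pandas"
    SPARK = "spark"


# Hypothetical stand-ins for the concrete transformation classes.
class PythonTransformation: ...
class PandasTransformation: ...
class SparkTransformation: ...


# Mode -> transformation class, replacing the if/elif chain. Keys cover
# both the enum member and its raw string value, since `mode` is typed
# Union[TransformationMode, str].
_MODE_TO_TRANSFORMATION = {
    TransformationMode.PYTHON: PythonTransformation,
    "python": PythonTransformation,
    TransformationMode.PANDAS: PandasTransformation,
    "pandas": PandasTransformation,
    TransformationMode.SPARK: SparkTransformation,
    "spark": SparkTransformation,
}


def get_feature_transformation(mode, udf) -> Optional[object]:
    if not udf:
        return None
    cls = _MODE_TO_TRANSFORMATION.get(mode)
    if cls is None:
        raise ValueError(f"Unsupported transformation mode: {mode!r}")
    return cls()
```

The lookup table keeps the dispatch in one place, so adding a new mode is a one-line change instead of another elif branch.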
```python
udf_string: str = "",
tags: Optional[Dict[str, str]] = None,
description: str = "",
owner: str = "",
```
Probably can add the singleton parameter too
```python
class TransformationMode(Enum):
    PYTHON = "python"
```
Nice
```python
self,
*,
name: str,
mode: Union[TransformationMode, str],
```
Nice
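Since `mode` accepts `Union[TransformationMode, str]`, one option is to normalize it to the enum at the constructor boundary so the rest of the code only deals with one type. A small sketch (an assumption about how this could be done, not the actual implementation):

```python
from enum import Enum
from typing import Union


class TransformationMode(Enum):
    PYTHON = "python"
    PANDAS = "pandas"
    SPARK = "spark"


def normalize_mode(mode: Union[TransformationMode, str]) -> TransformationMode:
    """Coerce a raw string like "pandas" into the enum, case-insensitively."""
    if isinstance(mode, TransformationMode):
        return mode
    try:
        # Enum lookup by value: TransformationMode("pandas") -> PANDAS.
        return TransformationMode(mode.lower())
    except ValueError:
        raise ValueError(
            f"Unknown transformation mode {mode!r}; "
            f"expected one of {[m.value for m in TransformationMode]}"
        ) from None
```

With this in place, comparisons like `self.mode == TransformationMode.pandas or self.mode == "pandas"` collapse to a single enum comparison.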
```python
class TransformationMode(Enum):
    PYTHON = "python"
    PANDAS = "pandas"
    spark = "spark"
```
```diff
-    spark = "spark"
+    SPARK = "spark"
```
Do you think we can scope one type of batch transformation as our MVP? If so, which one would you feel most comfortable doing?
Yep, maybe Spark SQL. That should be easy to implement, and we can get it live soon.
LFG!
FYI @HaoXuAI, a very selfish goal I have is to use a Spark batch transformation to handle scaling embedding documents. So in an ideal world we could do something like:

```python
def create_embeddings(partitionData):
    tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")
    model = AutoModel.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")
    for row in partitionData:
        document = str(row.document)
        inputs = tokenizer(document, padding=True, truncation=True, return_tensors="pt", max_length=512)
        result = model(**inputs)
        embeddings = result.last_hidden_state[:, 0, :].cpu().detach().numpy()
        lst = embeddings.flatten().tolist()
        yield [row.id, lst, "", "{}", None]
```

And:

```python
embeddings = dataset_df.rdd.mapPartitions(create_embeddings)
```

Used in a batch feature view and an on demand feature view (on write) to support offline and online processing of docs for RAG. The key thing here is that we'd be able to really scale RAG offline embedding and make the transition seamless to online. This example is from the Pinecone docs here: https://docs.pinecone.io/integrations/databricks#3-create-the-vector-embeddings
This seems to be a really good example of the BatchFeatureView + transformation.
Force-pushed from e5cb88b to e925038.
PR diverged; cherry-pick at #5181.
What this PR does / why we need it:
Created a Transformation interface. It still works with the current pandas_transformation, python_transformation, etc.
The next step is to refactor the BatchMaterializationEngine to make it work for both Materialization and Transformation.
Which issue(s) this PR fixes:
#4584
#4277 (comment)
#4696
Misc