KEP-2170: Create model and dataset initializers #2303
Changes from 3 commits
Dataset initializer Dockerfile:

```dockerfile
FROM python:3.11-alpine

WORKDIR /workspace

# Copy the required Python modules.
COPY cmd/initiailizer_v2/dataset/requirements.txt .
COPY sdk/python/kubeflow sdk/python/kubeflow
COPY pkg/initiailizer_v2 pkg/initiailizer_v2

# Install the needed packages.
RUN pip install -r requirements.txt

ENTRYPOINT ["python", "-m", "pkg.initiailizer_v2.dataset"]
```
Dataset initializer requirements.txt:

```
huggingface_hub==0.23.4
```
Model initializer Dockerfile:

```dockerfile
FROM python:3.11-alpine

WORKDIR /workspace

# Copy the required Python modules.
COPY cmd/initiailizer_v2/model/requirements.txt .
COPY sdk/python/kubeflow sdk/python/kubeflow
COPY pkg/initiailizer_v2 pkg/initiailizer_v2

# Install the needed packages.
RUN pip install -r requirements.txt

ENTRYPOINT ["python", "-m", "pkg.initiailizer_v2.model"]
```
Model initializer requirements.txt:

```
huggingface_hub==0.23.4
```
Dataset initializer entrypoint:

```python
import logging
import os
from urllib.parse import urlparse

import pkg.initiailizer_v2.utils.utils as utils
from pkg.initiailizer_v2.dataset.huggingface import HuggingFace

logging.basicConfig(
    format="%(asctime)s %(levelname)-8s [%(filename)s:%(lineno)d] %(message)s",
    datefmt="%Y-%m-%dT%H:%M:%SZ",
    level=logging.INFO,
)

if __name__ == "__main__":
    logging.info("Starting dataset initialization")

    try:
        storage_uri = os.environ[utils.STORAGE_URI_ENV]
    except Exception as e:
        logging.error("STORAGE_URI env variable must be set.")
        raise e

    match urlparse(storage_uri).scheme:
        # TODO (andreyvelich): Implement more dataset providers.
        case utils.HF_SCHEME:
            hf = HuggingFace()
            hf.load_config()
            hf.download_dataset()
        case _:
            logging.error("STORAGE_URI must have the valid dataset provider")
            raise Exception
```
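For reference, here is how an `hf://` storage URI flows through the dispatch above and the HuggingFace provider below: the URI scheme selects the provider, and the remainder of the URI becomes the repository id. A small illustrative sketch; the dataset name is only an example.

```python
from urllib.parse import urlparse

# Hypothetical STORAGE_URI value; any "hf://<owner>/<repo>" URI is split the same way.
parsed = urlparse("hf://tatsu-lab/alpaca")

print(parsed.scheme)                # "hf" -> matches utils.HF_SCHEME
print(parsed.netloc + parsed.path)  # "tatsu-lab/alpaca" -> used as the repo_id
```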
Dataset HuggingFace config dataclass:

```python
from dataclasses import dataclass
from typing import Optional


# TODO (andreyvelich): This should be moved under Training V2 SDK.
@dataclass
class HuggingFaceDatasetConfig:
    storage_uri: str
    access_token: Optional[str] = None
```
Dataset initializer HuggingFace provider:

```python
import logging
from urllib.parse import urlparse

import huggingface_hub

import pkg.initiailizer_v2.utils.utils as utils

# TODO (andreyvelich): This should be moved to SDK V2 constants.
import sdk.python.kubeflow.storage_initializer.constants as constants
from pkg.initiailizer_v2.dataset.config import HuggingFaceDatasetConfig

logging.basicConfig(
    format="%(asctime)s %(levelname)-8s [%(filename)s:%(lineno)d] %(message)s",
    datefmt="%Y-%m-%dT%H:%M:%SZ",
    level=logging.INFO,
)


class HuggingFace(utils.DatasetProvider):

    def load_config(self):
        config_dict = utils.get_config_from_env(HuggingFaceDatasetConfig)
        logging.info(f"Config for HuggingFace dataset initiailizer: {config_dict}")
        self.config = HuggingFaceDatasetConfig(**config_dict)

    def download_dataset(self):
        storage_uri_parsed = urlparse(self.config.storage_uri)
        dataset_uri = storage_uri_parsed.netloc + storage_uri_parsed.path

        logging.info(f"Downloading dataset: {dataset_uri}")
        logging.info("-" * 40)

        if self.config.access_token:
            huggingface_hub.login(self.config.access_token)

        huggingface_hub.snapshot_download(
            repo_id=dataset_uri,
            repo_type="dataset",
            local_dir=constants.VOLUME_PATH_DATASET,
        )

        logging.info("Dataset has been downloaded")
```

Review thread on `download_dataset`:

> **Contributor:** Do we have unit tests for these, or would that be taken care of in e2e? Downloading models from HF was having issues previously; it would be helpful to have it tested.
>
> **Contributor:** Never mind, just realised we have #2305.

Review thread on the `snapshot_download` call:

> **Contributor:** To speed things up, should we set `max_workers` equal to the number of files being downloaded? Currently it downloads 8 files in parallel.
>
> **Member (Author):** Do we have any benchmarks that show that setting `max_workers` to the number of files speeds up download time?
Model initializer entrypoint:

```python
import logging
import os
from urllib.parse import urlparse

import pkg.initiailizer_v2.utils.utils as utils
from pkg.initiailizer_v2.model.huggingface import HuggingFace

logging.basicConfig(
    format="%(asctime)s %(levelname)-8s [%(filename)s:%(lineno)d] %(message)s",
    datefmt="%Y-%m-%dT%H:%M:%SZ",
    level=logging.INFO,
)

if __name__ == "__main__":
    logging.info("Starting pre-trained model initialization")

    try:
        storage_uri = os.environ[utils.STORAGE_URI_ENV]
    except Exception as e:
        logging.error("STORAGE_URI env variable must be set.")
        raise e

    match urlparse(storage_uri).scheme:
        # TODO (andreyvelich): Implement more model providers.
        case utils.HF_SCHEME:
            hf = HuggingFace()
            hf.load_config()
            hf.download_model()
        case _:
            logging.error(
                f"STORAGE_URI must have the valid model provider. STORAGE_URI: {storage_uri}"
            )
            raise Exception
```
Model HuggingFace config dataclass:

```python
from dataclasses import dataclass
from typing import Optional


# TODO (andreyvelich): This should be moved under Training V2 SDK.
@dataclass
class HuggingFaceModelInputConfig:
    storage_uri: str
    access_token: Optional[str] = None
```
Model initializer HuggingFace provider:

```python
import logging
from urllib.parse import urlparse

import huggingface_hub

import pkg.initiailizer_v2.utils.utils as utils

# TODO (andreyvelich): This should be moved to SDK V2 constants.
import sdk.python.kubeflow.storage_initializer.constants as constants
from pkg.initiailizer_v2.model.config import HuggingFaceModelInputConfig

logging.basicConfig(
    format="%(asctime)s %(levelname)-8s [%(filename)s:%(lineno)d] %(message)s",
    datefmt="%Y-%m-%dT%H:%M:%SZ",
    level=logging.INFO,
)


class HuggingFace(utils.ModelProvider):

    def load_config(self):
        config_dict = utils.get_config_from_env(HuggingFaceModelInputConfig)
        logging.info(f"Config for HuggingFace model initiailizer: {config_dict}")
        self.config = HuggingFaceModelInputConfig(**config_dict)

    def download_model(self):
        storage_uri_parsed = urlparse(self.config.storage_uri)
        model_uri = storage_uri_parsed.netloc + storage_uri_parsed.path

        logging.info(f"Downloading model: {model_uri}")
        logging.info("-" * 40)

        if self.config.access_token:
            huggingface_hub.login(self.config.access_token)

        # TODO (andreyvelich): We should verify these patterns for different models.
        huggingface_hub.snapshot_download(
            repo_id=model_uri,
            local_dir=constants.VOLUME_PATH_MODEL,
            allow_patterns=["*.json", "*.safetensors", "*.model"],
            ignore_patterns=["*.msgpack", "*.h5", "*.bin"],
        )

        logging.info("Model has been downloaded")
```

Review thread on the `allow_patterns` argument:

> **Reviewer:** For `snapshot_download(repo_id="mistralai/Mistral-7B-Instruct-v0.3", allow_patterns=["params.json", "consolidated.safetensors", "tokenizer.model.v3"], local_dir=mistral_models_path)`: downloading the above Mistral model with the current `allow_patterns`, the downloaded size will be 29 GB (double the actual size of 14.5 GB). You probably need some logic to handle the Mistral model.
>
> **Member (Author):** Thanks for sharing! Let me add it to the TODO list.
Shared initializer utils:

```python
import os
from abc import ABC, abstractmethod
from dataclasses import fields
from typing import Dict

STORAGE_URI_ENV = "STORAGE_URI"
HF_SCHEME = "hf"


class ModelProvider(ABC):
    @abstractmethod
    def load_config(self):
        raise NotImplementedError()

    @abstractmethod
    def download_model(self):
        raise NotImplementedError()


class DatasetProvider(ABC):
    @abstractmethod
    def load_config(self):
        raise NotImplementedError()

    @abstractmethod
    def download_dataset(self):
        raise NotImplementedError()


# Get DataClass config from the environment variables.
# Env names must be equal to the DataClass parameters.
def get_config_from_env(config) -> Dict[str, str]:
    config_from_env = {}
    for field in fields(config):
        config_from_env[field.name] = os.getenv(field.name.upper())

    return config_from_env
```

Review thread on the `config` parameter of `get_config_from_env`:

> **Contributor:** Do you want some typing hints for `config`?
>
> **Member (Author):** Actually, I wasn't able to find a Python type that can represent a dataclass, as described here: https://stackoverflow.com/questions/54668000/type-hint-for-an-instance-of-a-non-specific-dataclass#:~:text=Despite%20its%20name%2C%20dataclasses.dataclass%20doesn%27t%20expose%20a%20class%20interface..
>
> **Contributor:** Ah, I see. Then it's fine to go without.
>
> **Contributor:** I'm not much of a Python dev these days, so I don't know about the typing for data classes.
>
> **Contributor:** Shall we use it like this?
>
> **Member (Author):** I tried this, it won't work. E.g. I can see this error from Pylance:
>
> **Contributor:** This is working @andreyvelich
>
> **Member (Author):** @deepanker13 What type validator do you use in your IDE?
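On the typing thread above: Python has no public runtime type that means "any dataclass", but a structural `typing.Protocol` keyed on `__dataclass_fields__` is a commonly used workaround (static checkers can still be picky about passing the class rather than an instance, which may be the Pylance error mentioned). The sketch below is illustrative only, not what the PR settled on, and it also shows the field-name-to-environment-variable convention that `get_config_from_env` relies on; all values are made up.

```python
import os
from dataclasses import dataclass, fields
from typing import Any, ClassVar, Dict, Optional, Protocol


class DataclassLike(Protocol):
    # Structural marker: every class generated by @dataclass carries this attribute.
    __dataclass_fields__: ClassVar[Dict[str, Any]]


def get_config_from_env(config: DataclassLike) -> Dict[str, Optional[str]]:
    # Same behaviour as the function above; dataclasses.fields() accepts either a
    # dataclass or an instance of one, so nothing changes at runtime.
    return {f.name: os.getenv(f.name.upper()) for f in fields(config)}


@dataclass
class HuggingFaceDatasetConfig:
    storage_uri: str
    access_token: Optional[str] = None


# Field names map to upper-cased environment variable names (example values only).
os.environ["STORAGE_URI"] = "hf://tatsu-lab/alpaca"
os.environ["ACCESS_TOKEN"] = "hf_example_token"
print(get_config_from_env(HuggingFaceDatasetConfig))
# {'storage_uri': 'hf://tatsu-lab/alpaca', 'access_token': 'hf_example_token'}
```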
Review thread on the package spelling:

> **Reviewer:** There is a typo at multiple places: `initializer`.
>
> **Author:** Great catch! Let me fix that.