2 changes: 1 addition & 1 deletion CREATE_WORKLOAD_GUIDE.md
@@ -53,7 +53,7 @@ By default, workloads created will come with the following operations run in the

To invoke the newly created workload, run the following:
```
$ opensearch-benchmark execute_test \
$ opensearch-benchmark run-test \
--pipeline="benchmark-only" \
--workload-path="<PATH OUTPUTTED IN THE OUTPUT OF THE CREATE-WORKLOAD COMMAND>" \
--target-host="<CLUSTER ENDPOINT>" \
34 changes: 17 additions & 17 deletions DEVELOPER_GUIDE.md
@@ -8,7 +8,7 @@ This document will walk you through on what's needed to start contributing code
- [Setup](#setup)
- [Importing the project into an IDE](#importing-the-project-into-an-ide)
- [Setting Up a Local OpenSearch Cluster For OSB Development (Optional)](#setting-up-a-local-opensearch-cluster-for-osb-development-optional)
- [Executing tests](#executing-tests)
- [Running tests](#running-tests)
- [Unit tests](#unit-tests)
- [Integration tests](#integration-tests)
- [Submitting your changes for a pull request](#submitting-your-changes-for-a-pull-request)
@@ -28,7 +28,7 @@ This document will walk you through on what's needed to start contributing code

`pyenv` requires that the C compiler and development libraries be installed, so that the specified Python versions can be built from source. The installation instructions vary from platform to platform.

For Debian-based systems, install the following modules to continue with the next steps:
```
sudo apt-get install -y make build-essential libssl-dev zlib1g-dev libbz2-dev \
libreadline-dev libsqlite3-dev wget curl llvm libncurses5-dev libncursesw5-dev \
@@ -65,9 +65,9 @@ This document will walk you through on what's needed to start contributing code

### Setup

To develop OSB properly, it is recommended that you fork the official OpenSearch Benchmark repository.

For those working on WSL2, it is recommended to clone the repository and set up the working environment within the Linux subsystem. Refer to the guide for setting up WSL2 on [Visual Studio Code](https://code.visualstudio.com/docs/remote/wsl) or [PyCharm](https://www.jetbrains.com/help/pycharm/using-wsl-as-a-remote-interpreter.html#create-wsl-interpreter).

After you have cloned the forked copy of OpenSearch Benchmark, use the following command-line instructions to set up OpenSearch Benchmark for development:
```
@@ -98,9 +98,9 @@ In order to run tests within the PyCharm IDE, ensure the `Python Integrated Tool`

## Setting Up a Local OpenSearch Cluster For OSB Development (Optional)

### OpenSearch Installation

Download the latest release of OpenSearch from https://opensearch.org/downloads.html. If you are using WSL, make sure to download it into your `/home/<user>` directory instead of `/mnt/c`.
```
wget https://artifacts.opensearch.org/releases/bundle/opensearch/<x.x.x>/opensearch-<x.x.x>-linux-x64.tar.gz
tar -xf opensearch-x.x.x-linux-x64.tar.gz
@@ -110,17 +110,17 @@ NOTE: Have Docker running in the background for the next steps. Refer to the ins

### OpenSearch Cluster setup

Add the following settings to the `opensearch.yml` file under the config directory
```
vim config/opensearch.yml
```
```
#
discovery.type: single-node
plugins.security.disabled: true
#
```
Run the `opensearch-tar-install.sh` script to install and set up a cluster for our use.
```
bash opensearch-tar-install.sh
```
@@ -146,25 +146,25 @@ Check the output of `curl.exe "http://localhost:9200/_cluster/health?pretty"`. O
"active_shards_percent_as_number" : 100.0
}
```
Now, you have a local cluster running! You can connect to this and run the workload for the next step.

### Running Workloads on a locally installed Cluster

Here's a sample executation of the geonames benchmark which can be found from the [workloads](https://github.com/opensearch-project/opensearch-benchmark-workloads) repo.
Here's a sample run of the geonames benchmark, which can be found in the [workloads](https://github.com/opensearch-project/opensearch-benchmark-workloads) repo.
```
opensearch-benchmark execute-test --pipeline=benchmark-only --workload=geonames --target-host=127.0.0.1:9200 --test-mode --workload-params '{"number_of_shards":"1","number_of_replicas":"0"}'
opensearch-benchmark run-test --pipeline=benchmark-only --workload=geonames --target-host=127.0.0.1:9200 --test-mode --workload-params '{"number_of_shards":"1","number_of_replicas":"0"}'
```

And we're done! You should be seeing the performance metrics soon enough!

### Debugging

**If you are not seeing any results, it is likely an indicator of an issue with your cluster setup or the way the manager is accessing it.** Use the command below to view the logs.
```
tail -f ~/.benchmark/logs/benchmark.log
```
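If the log is long, standard shell tooling (not an OSB-specific command) can help surface problems quickly; for example:
```
# show the most recent error and warning entries from the OSB log
grep -iE "error|warn" ~/.benchmark/logs/benchmark.log | tail -n 50
```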

## Executing tests
## Running tests

Once setup is complete, you may run the unit and integration tests.
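If your checkout exposes the usual Make targets (an assumption — confirm against the Makefile in your clone), the two suites can typically be invoked like this:
```
# assumption: these targets exist in the repository's Makefile
make test    # unit tests
make it      # integration tests
```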

@@ -230,7 +230,7 @@ make install
To streamline the process, please refer to [this guide](https://github.com/opensearch-project/opensearch-benchmark/blob/main/PYTHON_SUPPORT_GUIDE.md)

### Debugging OpenSearch Benchmark in Developer Mode
Many users find that the simplest way to debug OpenSearch Benchmark is by using developer mode. Users can activate developer mode by running `python3 -m pip install -e .` within the cloned OpenSearch Benchmark repository. Any changes made and saved will be reflected when OpenSearch Benchmark is run. Users can add loggers or print statements and see the changes reflected in subsequent runs.
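As a minimal sketch of that workflow (the temporary log line below is illustrative, not a snippet from the repository):
```
# install OSB from your cloned fork in editable (developer) mode
python3 -m pip install -e .

# edit any source file, e.g. add a temporary logger or print call such as
#   logging.getLogger(__name__).info("reached my code path")
# then rerun any OSB command -- the saved change is picked up without reinstalling
opensearch-benchmark --help
```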

### Debugging Unittests in Visual Studio Code
To run and debug unittests in Visual Studio Code, add the following configuration to the Python Debugger `launch.json` file. See [the official Visual Studio Code documentation](https://code.visualstudio.com/docs/editor/debugging) for more information on setting up and accessing the `launch.json` file.
14 changes: 7 additions & 7 deletions PYTHON_SUPPORT_GUIDE.md
@@ -27,17 +27,17 @@ supported_python_versions = [(3, 8), (3, 9), (3, 10), (3, 11), (3, 12)]

**Basic OpenSearch Benchmark command with distribution version and test mode**
```
opensearch-benchmark execute-test --distribution-version=1.0.0 --workload=geonames --test-mode
opensearch-benchmark run-test --distribution-version=1.0.0 --workload=geonames --test-mode
```

**OpenSearch Benchmark command executing test on target-host in test mode**
**OpenSearch Benchmark command running a test on target-host in test mode**
```
opensearch-benchmark execute-test --workload=geonames --pipeline=benchmark-only --target-host="<OPENSEARCH CLUSTER ENDPOINT>" --client-options="basic_auth_user:'<USERNAME>',basic_auth_password:'<PASSWORD>'" --test-mode"
opensearch-benchmark run-test --workload=geonames --pipeline=benchmark-only --target-host="<OPENSEARCH CLUSTER ENDPOINT>" --client-options="basic_auth_user:'<USERNAME>',basic_auth_password:'<PASSWORD>'" --test-mode
```

**OpenSearch-Benchmark command executing test on target-host without test mode**
**OpenSearch-Benchmark command running a test on target-host without test mode**
```
opensearch-benchmark execute-test --workload=geonames --pipeline=benchmark-only --target-host="<OPENSEARCH CLUSTER ENDPOINT>" --client-options="basic_auth_user:'<USERNAME>',basic_auth_password:'<PASSWORD>'"
opensearch-benchmark run-test --workload=geonames --pipeline=benchmark-only --target-host="<OPENSEARCH CLUSTER ENDPOINT>" --client-options="basic_auth_user:'<USERNAME>',basic_auth_password:'<PASSWORD>'"
```

To ensure that users are using the correct python versions, install the repository with `python3 -m pip install -e .` and run `which opensearch-benchmark` to get the path. Pre-append this path to each of the three commands above and re-run them in the command line.
@@ -46,12 +46,12 @@ Keep in mind the file path outputted differs for each operating system and might

- For example: When running `which opensearch-benchmark` on an Ubuntu environment, the command line outputs `/home/ubuntu/.pyenv/shims/opensearch-benchmark`. On closer inspection, the path points to a shell script. Thus, to invoke OpenSearch Benchmark, pre-append the OpenSearch Benchmark command with `bash` and the path outputted earlier:
```
bash -x /home/ubuntu/.pyenv/shims/opensearch-benchmark execute-test --workload=geonames --pipeline=benchmark-only --target-host="<OPENSEARCH CLUSTER ENDPOINT>" --client-options="basic_auth_user:'<USERNAME>',basic_auth_password:'<PASSWORD>'"
bash -x /home/ubuntu/.pyenv/shims/opensearch-benchmark run-test --workload=geonames --pipeline=benchmark-only --target-host="<OPENSEARCH CLUSTER ENDPOINT>" --client-options="basic_auth_user:'<USERNAME>',basic_auth_password:'<PASSWORD>'"
```

- Another example: When running `which opensearch-benchmark` on an Amazon Linux 2 environment, the command line outputs `~/.local/bin/opensearch-benchmark`. On closer inspection, the path points to a Python script. Thus, to invoke OpenSearch Benchmark, pre-append the OpenSearch Benchmark command with `python3` and the path outputted earlier:
```
python3 ~/.local/bin/opensearch-benchmark execute-test --workload=geonames --pipeline=benchmark-only --target-host="<OPENSEARCH CLUSTER ENDPOINT>" --client-options="basic_auth_user:'<USERNAME>',basic_auth_password:'<PASSWORD>'"
python3 ~/.local/bin/opensearch-benchmark run-test --workload=geonames --pipeline=benchmark-only --target-host="<OPENSEARCH CLUSTER ENDPOINT>" --client-options="basic_auth_user:'<USERNAME>',basic_auth_password:'<PASSWORD>'"
```

### Creating a Pull Request After Adding Changes and Testing Them Out
2 changes: 1 addition & 1 deletion README.md
@@ -64,4 +64,4 @@ Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
License for the specific language governing permissions and limitations under
the License.
4 changes: 2 additions & 2 deletions RELEASE_GUIDE.md
@@ -43,7 +43,7 @@ Add backport labels to PRs and commits so that changes can be added to `main` br
* Since releases are generally published on Thursdays, maintainers should try to ensure all changes are merged in by Tuesday.
* A week prior to the scheduled release, maintainers should announce the fact in the [#performance channel](https://opensearch.slack.com/archives/C0516H8EJ7R) within the OpenSearch Slack community.
* Ensure that documentation is appropriately updated with respect to incoming changes prior to the release.

## Release the new version of OpenSearch Benchmark to PyPI, Docker, and ECR

1. Clone the official OpenSearch Benchmark git repository and change directory to it. This is where the following commands will be issued.
@@ -123,7 +123,7 @@ Send this message in the following channels in OpenSearch Community Slack:

If an error occurs during the build process and you need to retrigger the workflow, do the following (a consolidated command sketch follows this list):

* Delete the tag locally: `git tag -d <VERSION>`
* Delete the tag on GitHub: `git push --delete origin <VERSION>`
* Delete the draft release on GitHub
* Create the tag again and push it to re-initiate the release process.
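Taken together, those retrigger steps look roughly like the following (replace `<VERSION>` with the actual tag; deleting the draft release happens in the GitHub UI, and the exact tag-creation step may differ if your release process uses a helper script):
```
git tag -d <VERSION>                 # delete the tag locally
git push --delete origin <VERSION>   # delete the tag on GitHub
# delete the draft release in the GitHub UI, then recreate and push the tag
git tag <VERSION>
git push origin <VERSION>
```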