feat: InferenceSpec support for MMS and testing #4763


Merged (19 commits) on Jul 10, 2024

Conversation

@bryannahm1 (Contributor) commented Jul 1, 2024:

Issue #, if available:

Description of changes:
Added InferenceSpec support for MMS by adding an inference.py file and making code changes to the testing script, and enabled the support in both local and endpoint modes.

Testing done:
Integration tests run locally
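For context, the InferenceSpec path lets a user supply load/invoke hooks instead of a model object. A minimal self-contained sketch of that shape (the `InferenceSpec` base below is a stand-in for the SDK's `sagemaker.serve` class, and `EchoSpec` is a hypothetical toy spec):

```python
from abc import ABC, abstractmethod

class InferenceSpec(ABC):
    """Stand-in for the sagemaker.serve InferenceSpec (assumed interface)."""

    @abstractmethod
    def load(self, model_dir: str):
        """Load and return the model object from model_dir."""

    @abstractmethod
    def invoke(self, input_object, model):
        """Run inference on input_object using the loaded model."""

class EchoSpec(InferenceSpec):
    """Hypothetical toy spec: the 'model' is just a prefix string."""

    def load(self, model_dir: str):
        return "echo:"

    def invoke(self, input_object, model):
        return f"{model} {input_object}"

spec = EchoSpec()
model = spec.load("/opt/ml/model")
print(spec.invoke("hello", model))  # echo: hello
```

A user would pass such a spec to ModelBuilder instead of a `model`, which is why the validation and handler changes in this PR are needed.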

Merge Checklist

Put an x in the boxes that apply. You can also fill these out after creating the PR. If you're unsure about any of them, don't hesitate to ask. We're here to help! This is simply a reminder of what we are going to look for before merging your pull request.

General

  • I have read the CONTRIBUTING doc
  • I certify that the changes I am introducing will be backward compatible, and I have discussed concerns about this, if any, with the Python SDK team
  • I used the commit message format described in CONTRIBUTING
  • I have passed the region in to all S3 and STS clients that I've initialized as part of this change.
  • I have updated any necessary documentation, including READMEs and API docs (if appropriate)

Tests

  • I have added tests that prove my fix is effective or that my feature works (if appropriate)
  • I have added unit and/or integration tests as appropriate to ensure backward compatibility of the changes
  • I have checked that my tests are not configured for a specific region or account (if appropriate)
  • I have used unique_name_from_base to create resource names in integ tests (if appropriate)

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

image_uri: str,
inference_spec: InferenceSpec = None,
) -> str:
"""This is a one-line summary of the function.
Contributor:

update docstring

@bryannahm1 (author):

Has been updated, thank you

@@ -109,7 +119,7 @@ def _get_hf_metadata_create_model(self) -> Type[Model]:
        """

        hf_model_md = get_huggingface_model_metadata(
-           self.model, self.env_vars.get("HUGGING_FACE_HUB_TOKEN")
+           self.env_vars.get("HF_MODEL_ID"), self.env_vars.get("HUGGING_FACE_HUB_TOKEN")
@grenmester (Contributor) commented Jul 2, 2024:

what happens if model is set with an HF model ID but the environment variable doesn't exist? does the environment variable get set based on self.model before this line is executed?
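To illustrate the concern: if `self.model` holds an HF model ID but `HF_MODEL_ID` was never set, the lookup would receive `None`. A hypothetical fallback, purely to show the shape of a fix (the helper name and behavior are illustrative, not the SDK's actual code):

```python
def resolve_hf_model_id(model, env_vars):
    # Prefer the env var, but fall back to the `model` string so the
    # metadata lookup never receives None (illustrative only).
    return env_vars.get("HF_MODEL_ID") or model

print(resolve_hf_model_id("gpt2", {}))  # gpt2
print(resolve_hf_model_id("gpt2", {"HF_MODEL_ID": "bert-base-uncased"}))  # bert-base-uncased
```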


)

self.pysdk_model = self._create_transformers_model()

if self.mode == Mode.LOCAL_CONTAINER:
self._prepare_for_mode()

logger.info("Model configuration %s", self.pysdk_model)
Contributor:

this logs a lot of info that customers may not necessarily need to see, what's the reason for adding this log line here?

@bryannahm1 (author):

Great comment, I will remove it.

@bryannahm1 bryannahm1 marked this pull request as ready for review July 2, 2024 23:50
@bryannahm1 bryannahm1 requested a review from a team as a code owner July 2, 2024 23:50
@bryannahm1 bryannahm1 requested a review from zhaoqizqwang July 2, 2024 23:50
@samruds samruds requested review from makungaj1, samruds and jiapinw and removed request for zhaoqizqwang July 2, 2024 23:53
@samruds samruds requested a review from gwang111 July 3, 2024 05:34
@@ -881,8 +881,8 @@ def _build_for_model_server(self):  # pylint: disable=R0911, R1710
        if self.model_metadata:
            mlflow_path = self.model_metadata.get(MLFLOW_MODEL_PATH)

-       if not self.model and not mlflow_path:
-           raise ValueError("Missing required parameter `model` or 'ml_flow' path")
+       if not self.model and not mlflow_path and not self.inference_spec:
Contributor:

Do we need this here? Since we already have a PR for it? #4769

@samruds (Collaborator) commented Jul 3, 2024:

yes these changes can be removed, since i've moved this to another PR and added UT coverage for it

@bryannahm1 (author):

Did the sync, thank you


class MultiModelServerPrepareTests(TestCase):
def test_start_invoke_destroy_local_multi_model_server(self):
Collaborator:

I think we need to re-run this locally to understand where it is failing

@bryannahm1 (author):

Got it

if isinstance(obj[0], InferenceSpec):
inference_spec, schema_builder = obj

logger.info("in model_fn")
@samruds (Collaborator) commented Jul 4, 2024:

This log statement can be removed

@bryannahm1 (author):

It has been removed
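The InferenceSpec branch above can be sketched end to end with stand-ins (`FakeSpec` and `build_predict_callable` are hypothetical; the real model_fn deserializes the `(inference_spec, schema_builder)` pair from the model directory):

```python
class FakeSpec:
    """Hypothetical stand-in for a user-supplied InferenceSpec."""

    def load(self, model_dir):
        return {"bias": 1}

    def invoke(self, x, model):
        return x + model["bias"]

def build_predict_callable(obj, model_dir):
    # Sketch of the InferenceSpec branch: when the deserialized object
    # is a (spec, schema_builder) pair, load the model once and close
    # over it in the returned predict callable.
    inference_spec, schema_builder = obj
    model = inference_spec.load(model_dir)
    return lambda data: inference_spec.invoke(data, model)

predict = build_predict_callable((FakeSpec(), None), "/opt/ml/model")
print(predict(41))  # 42
```

Loading once and closing over the model keeps per-request invocations cheap, which is the usual design for MMS-style handlers.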


def predict_fn(input_data, predict_callable):
"""Placeholder docstring"""
logger.info("in predict_fn")
@samruds (Collaborator) commented Jul 4, 2024:

This log statement can be removed

@bryannahm1 (author):

It's now removed



def predict_fn(input_data, predict_callable):
"""Placeholder docstring"""
Collaborator:

Can we update docstring here with what the method does?

@bryannahm1 (author):

Updated, thank you
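A docstring along these lines would cover it (a sketch only; the argument descriptions are inferred from the handler's signature, not taken from the merged code):

```python
def predict_fn(input_data, predict_callable):
    """Run inference on the deserialized request payload.

    Args:
        input_data: Deserialized input produced by the input handler.
        predict_callable: Callable returned by model_fn that wraps the
            loaded model's inference entry point.

    Returns:
        The raw prediction, later serialized by output_fn.
    """
    return predict_callable(input_data)

print(predict_fn([1, 2, 3], lambda xs: [x * 2 for x in xs]))  # [2, 4, 6]
```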



def output_fn(predictions, accept_type):
"""Placeholder docstring"""
Collaborator:

Nit: Update doc string

@bryannahm1 (author):

It's now updated

@Aditi2424 previously approved these changes Jul 9, 2024
self.instance_type,
)
else:
raise ValueError("Cannot detect required model or inference spec")
Contributor:

Can we add more details on how to fix this error. Like what parameter does the customer need to pass to fix this.

@bryannahm1 (author):

Yes, it has just been updated. Thank you.
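For reference, an actionable message might look like this (the wording and helper name are hypothetical, not the merged text):

```python
def validate_model_inputs(model, mlflow_path, inference_spec):
    # Hypothetical validation: name the parameters the caller can pass
    # so the error tells the customer exactly how to fix it.
    if not model and not mlflow_path and not inference_spec:
        raise ValueError(
            "Cannot detect required model or inference spec. Please provide "
            "a `model`, an `inference_spec`, or an MLflow model path via "
            "`model_metadata` when constructing ModelBuilder."
        )

try:
    validate_model_inputs(None, None, None)
except ValueError as err:
    print(err)
```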

@Aditi2424 Aditi2424 merged commit 6789b61 into aws:master Jul 10, 2024
10 of 11 checks passed
@bryannahm1 bryannahm1 deleted the hf-inf-spec-pr branch July 10, 2024 20:11