Conversation
hi @JackHopkins How can I contact you? etc. (the Discord invite is invalid) thanks
Hey Kevin -
You can join here: https://discord.gg/ymmV6FKg
LMK if you can't and I can add you via email!
Daniel Kwak, CEO
Paperplane ( https://www.paperplane.ai/ )
404-704-2999
…On Tue, Nov 21, 2023 at 6:01 PM, Kevin Truong < ***@***.*** > wrote:
hi @JackHopkins ( https://github.com/JackHopkins )
WIP for Bedrock support.
I'm testing on my side and it's working, but I need to align with you to make good progress on the MR.
How can I contact you? etc. (the Discord invite is invalid)
thanks
JackHopkins
left a comment
Can you have a look at these comments and let me know what you think?
Additionally, it'd be really helpful to have some automated tests for Bedrock (possibly mocking out the Bedrock client) so we can ensure that we don't break anything as we continue development.
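As a starting point for such tests, here is a minimal sketch of what mocking the Bedrock client could look like; the function name and the response shape are assumptions based on the `invoke_model` usage elsewhere in this PR, not the project's actual test suite:

```python
import json
from unittest.mock import MagicMock

def test_bedrock_completion_parsing():
    # Mock the Bedrock runtime so no AWS credentials or network access is needed.
    fake_body = MagicMock()
    fake_body.read.return_value = json.dumps({"completion": "hello"}).encode()
    fake_runtime = MagicMock()
    fake_runtime.invoke_model.return_value = {"body": fake_body}

    # Exercise the same call/parse path the PR uses.
    response = fake_runtime.invoke_model(
        body="{}",
        modelId="anthropic.claude-instant-v1",
        contentType="application/json",
        accept="application/json")
    completion = json.loads(response.get("body").read().decode())["completion"]
    assert completion == "hello"
```

A test like this can run in CI without touching AWS, which is the point of mocking the client.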
| if ("error" in response and | ||
| "code" in response["error"] and | ||
| response["error"]["code"] == 'invalid_api_key'): | ||
| raise Exception(f"The supplied OpenAI API key {self.api_key} is invalid") |
Please change to "Anthropic API key"
```python
continue

if not choice:
    raise Exception("OpenAI API failed to generate a response")
```
```python
    modelId=model,
    contentType="application/json",
    accept="application/json")
return json.loads(response.get('body').read().decode())['completion']
```
We shouldn't be returning this here - we should loop the counter below in the case where the bedrock_runtime throws an exception.
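Something along these lines could work; this is only a sketch of the retry idea (the helper name, retry count, and backoff are assumptions, not the PR's code):

```python
import json
import time

def invoke_with_retries(bedrock_runtime, model, body, max_retries=3):
    """Hypothetical helper: retry the Bedrock call on failure instead of
    returning unconditionally, so transient runtime errors are absorbed."""
    for attempt in range(max_retries):
        try:
            response = bedrock_runtime.invoke_model(
                body=body,
                modelId=model,
                contentType="application/json",
                accept="application/json")
            return json.loads(response.get('body').read().decode())['completion']
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries: surface the original error
            time.sleep(2 ** attempt)  # simple exponential backoff
```

This keeps the happy path identical to the current code while looping the counter when `bedrock_runtime` throws.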
```python
return 'anthropic.claude-instant-v1'

elif api_model_name == 'openai':
    return ""
```
Let me double check to ensure whether this works.
| "Content-Type": "application/json", | ||
| } | ||
| response = requests.post( | ||
| OPENAI_URL, headers=openai_headers, json=params, timeout=50 |
We should replace this with the invoke_model you have above.
```python
training_file_id = response["id"]
# submit the finetuning job
try:
    finetuning_response = openai.FineTuningJob.create(training_file=training_file_id, model="gpt-3.5-turbo",
```
We need to figure out a way to call the logic to trigger a finetune in the relevant LLM provider class (i.e. OpenAI, Bedrock, etc.) - this is not the right place for this logic to live now that we are adding support for more providers.
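One possible shape for this refactor is a small abstract base class that each provider implements; the class and method names below are illustrative assumptions, not the project's actual interface:

```python
from abc import ABC, abstractmethod

class LLMProvider(ABC):
    """Hypothetical base class: each provider owns its own finetuning logic."""

    @abstractmethod
    def submit_finetune(self, training_file_id: str) -> str:
        """Submit a finetuning job and return its job id."""

class OpenAIProvider(LLMProvider):
    def submit_finetune(self, training_file_id: str) -> str:
        # In the real implementation this would call
        # openai.FineTuningJob.create(training_file=..., model=...).
        return f"ft-job-for-{training_file_id}"

class BedrockProvider(LLMProvider):
    def submit_finetune(self, training_file_id: str) -> str:
        # Bedrock finetuning would go here; explicit failure until supported.
        raise NotImplementedError("Bedrock finetuning not supported yet")
```

The caller then only deals with `LLMProvider`, so adding a new backend doesn't touch the shared finetuning call site.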
```python
from monkey_patch.language_models.llm_api_abc import LLM_Api
import os


OPENAI_URL = "https://api.openai.com/v1/chat/completions"
```
```python
OPENAI_URL = "https://api.openai.com/v1/chat/completions"
import requests


AWS_REGION = os.environ.get("AWS_DEFAULT_REGION", "us-east-1")
```
We should instantiate all envvars / bedrock specific logic inside the init of the Bedrock class - catching and reraising any thrown exceptions.
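A minimal sketch of that structure, assuming a `Bedrock` class and an injectable client factory (both names are assumptions, added so the env-var resolution and error re-raising are testable):

```python
import os

class Bedrock:
    """Hypothetical sketch: resolve env vars and build the client inside
    __init__, re-raising any setup failure with context."""

    def __init__(self, client_factory=None):
        try:
            # Env-var lookup lives here rather than at module import time.
            self.region = os.environ.get("AWS_DEFAULT_REGION", "us-east-1")
            # In practice this would be boto3.client("bedrock-runtime",
            # region_name=self.region); a factory is injected for testing.
            self.client = client_factory(self.region) if client_factory else None
        except Exception as exc:
            raise RuntimeError(f"Failed to initialise Bedrock client: {exc}") from exc
```

Keeping this in `__init__` means importing the module never fails just because AWS configuration is missing.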
```python
from monkey_patch.utils import approximate_token_count


class ApiModelFactory:
```
Can we please have this factory inside its own file.
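For instance, a standalone `api_model_factory.py` could look roughly like this; the registry approach is one option, not necessarily what the PR intends:

```python
# api_model_factory.py (illustrative): the factory in its own module
class ApiModelFactory:
    """Maps an API_MODEL name to a provider class."""

    _registry = {}

    @classmethod
    def register(cls, name, provider_cls):
        # Providers register themselves, keeping this module dependency-free.
        cls._registry[name] = provider_cls

    @classmethod
    def create(cls, name):
        try:
            return cls._registry[name]()
        except KeyError:
            raise ValueError(f"Unknown api model: {name}")
```

A registry also means adding a provider doesn't require editing the factory itself.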
src/monkey_patch/monkey.py
Outdated
```diff
  logging.basicConfig(level=ALIGN_LEVEL_NUM)
  logger = logger_factory(__name__)
- language_modeler = LanguageModel()
+ language_modeler = LanguageModel(api_model=os.getenv('API_MODEL'))
```
We should add the API_MODEL envvar to .example.env to make it obvious that it is required.
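Something like the following entry would do; the neighbouring variable names are guesses for illustration only:

```shell
# .example.env (illustrative): copy to .env and fill in real values
API_MODEL=openai            # which provider to use, e.g. 'openai' or 'bedrock'
AWS_DEFAULT_REGION=us-east-1
```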
4422bae to b411af8