Support for LLaVA image-to-text task in OpenVINO export #2400

@ELkarousWissem

Description

@ELkarousWissem

System Info

Hello Optimum team,

I’m trying to export the LLaVA model `xtuner/llava-phi-3-mini-hf` to OpenVINO FP16 using:


optimum-cli export openvino --model xtuner/llava-phi-3-mini-hf llava-phi-3-mini-ov/FP16 --weight-format fp16

However, I get the following error:

ValueError: Asked to export a llava model for the task image-to-text (auto-detected),
but the Optimum OpenVINO exporter only supports the tasks image-text-to-text for llava.
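The error says the task auto-detected from the model ("image-to-text") is not in the set of tasks the OpenVINO exporter supports for the `llava` model type ("image-text-to-text"). A minimal sketch of that kind of check, with hypothetical names that are illustrative only and not Optimum's actual internals:

```python
# Hypothetical sketch of the task-validation step an exporter performs.
# The table and function names are illustrative, not optimum's real API.
SUPPORTED_TASKS = {"llava": {"image-text-to-text"}}

def validate_task(model_type: str, task: str) -> None:
    """Raise if `task` is not exportable for `model_type`."""
    supported = SUPPORTED_TASKS.get(model_type, set())
    if task not in supported:
        raise ValueError(
            f"Asked to export a {model_type} model for the task {task}, "
            f"but only {sorted(supported)} are supported."
        )

validate_task("llava", "image-text-to-text")  # passes silently
# validate_task("llava", "image-to-text") would raise ValueError
```

Since `optimum-cli export openvino` accepts an explicit `--task` option, passing `--task image-text-to-text` may bypass the auto-detection, though whether the export then succeeds for this particular model is untested here.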

Who can help?

@echarlaix @IlyasMoutawwakil

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction (minimal, reproducible, runnable)

from pathlib import Path

import requests

utility_files = ["notebook_utils.py", "cmd_helper.py"]

for utility in utility_files:
    local_path = Path(utility)
    if not local_path.exists():
        r = requests.get(
            url=f"https://raw.githubusercontent.com/openvinotoolkit/openvino_notebooks/latest/utils/{local_path.name}",
        )
        with local_path.open("w") as f:
            f.write(r.text)

model_id = "xtuner/llava-phi-3-mini-hf"
MODEL_DIR = Path(model_id.split("/")[-1].replace("-hf", "-ov"))

# Read more about telemetry collection at https://github.com/openvinotoolkit/openvino_notebooks?tab=readme-ov-file#-telemetry

from notebook_utils import collect_telemetry

collect_telemetry("llava-next-multimodal-chatbot.ipynb")

from cmd_helper import optimum_cli

if not (MODEL_DIR / "FP16").exists():
    optimum_cli(model_id, MODEL_DIR / "FP16", additional_args={"weight-format": "fp16"})
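The `MODEL_DIR` line in the script above maps the Hugging Face repo id to a local output directory name by taking the part after the org and swapping the `-hf` suffix for `-ov`. A quick standalone check of that derivation:

```python
from pathlib import Path

model_id = "xtuner/llava-phi-3-mini-hf"
# Take the repo name after "xtuner/" and replace the "-hf" suffix with "-ov"
MODEL_DIR = Path(model_id.split("/")[-1].replace("-hf", "-ov"))
print(MODEL_DIR)  # llava-phi-3-mini-ov
```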

Expected behavior

CalledProcessError Traceback (most recent call last)
Cell In[2], line 4
1 from cmd_helper import optimum_cli
3 if not (MODEL_DIR / "FP16").exists():
----> 4 optimum_cli(model_id, MODEL_DIR / "FP16", additional_args={"weight-format": "fp16"})

File ~/cmd_helper.py:60, in optimum_cli(model_id, output_dir, show_command, additional_args, debug_logs)
58 if transofrmers_loglevel is not None:
59 os.environ["TRANSFORMERS_VERBOSITY"] = transofrmers_loglevel
---> 60 raise exc
61 finally:
62 if transofrmers_loglevel is not None:

File ~/cmd_helper.py:47, in optimum_cli(model_id, output_dir, show_command, additional_args, debug_logs)
44 os.environ["TRANSFORMERS_VERBOSITY"] = "debug"
46 try:
---> 47 subprocess.run(export_command.split(" "), shell=(platform.system() == "Windows"), check=True, capture_output=True)
48 except subprocess.CalledProcessError as exc:
49 logger = logging.getLogger()

File ~/miniconda3/lib/python3.12/subprocess.py:573, in run(input, capture_output, timeout, check, *popenargs, **kwargs)
571 retcode = process.poll()
572 if check and retcode:
--> 573 raise CalledProcessError(retcode, process.args,
574 output=stdout, stderr=stderr)
575 return CompletedProcess(process.args, retcode, stdout, stderr)

CalledProcessError: Command '['optimum-cli', 'export', 'openvino', '--model', 'xtuner/llava-phi-3-mini-hf', 'llava-phi-3-mini-ov/FP16', '--weight-format', 'fp16']' returned non-zero exit status 1.

Labels: bug (Something isn't working)