@sherry-lamonica-oracle
Here is the situation: we need a multilingual model such as intfloat/multilingual-e5-small, and we are following the instructions in "Convert Pretrained Models to ONNX Model: End-to-End Instructions for Text Embedding".
When we run the Python script below to export the model, it cannot connect to huggingface.co, and we eventually found that huggingface.co is blocked in China.
from oml.utils import ONNXPipeline, ONNXPipelineConfig
# Export to file
pipeline = ONNXPipeline(model_name="sentence-transformers/all-MiniLM-L6-v2")
pipeline.export2file("your_preconfig_file_name",output_dir=".")
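One thing we tried, assuming ONNXPipeline downloads models through the standard huggingface_hub machinery (which honors the HF_ENDPOINT variable), is pointing the download at a mirror that is reachable from China. This is only a sketch: hf-mirror.com is just an example mirror, and the output file name is a placeholder.

```python
import os

# Assumption: ONNXPipeline resolves model_name via huggingface_hub, which
# reads HF_ENDPOINT at import time, so it must be set before importing oml.
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"  # example mirror

from oml.utils import ONNXPipeline

# Same export as above, but for the multilingual model we actually need;
# downloads now go through the mirror instead of huggingface.co.
pipeline = ONNXPipeline(model_name="intfloat/multilingual-e5-small")
pipeline.export2file("multilingual_e5_small", output_dir=".")
```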
We are able to download the model in ONNX format by other means. My question is: how do we convert this downloaded model offline?
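What we are hoping for is something like the sketch below: pre-populate the local Hugging Face cache with the model files (on a machine that can reach huggingface.co or a mirror, or by copying ~/.cache/huggingface/hub over), then run the export with offline mode enabled. Whether ONNXPipeline actually reads from the local cache in this mode is exactly what we would like confirmed.

```python
import os

# Step 1 (on a machine with access, or via a mirror): populate the cache, e.g.
#   from huggingface_hub import snapshot_download
#   snapshot_download("intfloat/multilingual-e5-small")
# then copy ~/.cache/huggingface/hub to the offline machine.

# Step 2 (offline machine): force offline mode so nothing tries to reach
# huggingface.co, then run the same export against the cached files.
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["TRANSFORMERS_OFFLINE"] = "1"

from oml.utils import ONNXPipeline

pipeline = ONNXPipeline(model_name="intfloat/multilingual-e5-small")
pipeline.export2file("multilingual_e5_small", output_dir=".")
```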
PS: we ultimately need to load the ONNX model into 23ai, but loading the downloaded model directly into 23ai fails. It looks like 23ai does not support outputs with variable-sized, multi-dimensional shapes.
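For context on the PS: our understanding is that the in-database embedding models must produce a single fixed-length vector per input, while a raw ONNX export from Hugging Face outputs last_hidden_state with dynamic [batch, sequence, hidden] dimensions, which would explain the failure; the OML pipeline is supposed to add the pooling/normalization step that fixes the output shape. Once we have a properly converted file, we expect to load it roughly as in the sketch below, where the connection details, the ONNX_DIR directory object, and the file/model names are all placeholders.

```python
# A minimal sketch, assuming the converted .onnx file sits in a folder the
# database can read through a directory object named ONNX_DIR
# (created beforehand with CREATE DIRECTORY).
import os
import oracledb

conn = oracledb.connect(
    user="vector_user",                  # placeholder credentials and DSN
    password=os.environ["DB_PASSWORD"],
    dsn="localhost/freepdb1",
)
with conn.cursor() as cur:
    # Register the ONNX file as an in-database model for embedding.
    cur.execute("""
        BEGIN
          DBMS_VECTOR.LOAD_ONNX_MODEL(
            directory  => 'ONNX_DIR',
            file_name  => 'multilingual_e5_small.onnx',
            model_name => 'E5_SMALL');
        END;""")
conn.close()
```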