Can I export my model for internal usage?

Hi,

I created my models with AutoML (image classification or object detection).

Now, I would like to use them in my application locally (offline, disconnected).

Is it possible to extract a model file from AutoML that I can use locally (a .pb file, for instance)?

After some research, it seems that this is not possible, but I would like to be sure.

If it is possible, how?

Regards.

1 Like

I have found this documentation that enumerates the steps for exporting an image classification model as a TensorFlow SavedModel for use in a Docker container.

After exporting your model to a Google Cloud Storage bucket, you can use it to make predictions from a Docker container. You may refer to this documentation on how to deploy the model to a container.
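
In case it helps, here is a minimal sketch of what a local prediction call could look like once the exported model is being served from a Docker container with TensorFlow Serving. The model name (`default`), port (8501), and request payload shape (`image_bytes`/`key`) are assumptions based on the typical TensorFlow Serving REST API and the AutoML Vision Edge examples, so please check the exact values your export expects against the documentation.

```python
import base64
import json

import requests  # third-party: pip install requests

# Assumed: a TensorFlow Serving container is already running locally, e.g.
#   docker run -p 8501:8501 -v /path/to/exported_model:/models/default \
#       -e MODEL_NAME=default tensorflow/serving
# The model name ("default") and port (8501) are assumptions.
SERVER_URL = "http://localhost:8501/v1/models/default:predict"


def predict(image_path: str) -> dict:
    """Send one image to the locally served model and return the raw response."""
    with open(image_path, "rb") as f:
        encoded = base64.b64encode(f.read()).decode("utf-8")

    # Payload shape assumed from typical AutoML Vision Edge container examples;
    # verify the expected instance format in the export documentation.
    payload = {"instances": [{"image_bytes": {"b64": encoded}, "key": "1"}]}
    response = requests.post(SERVER_URL, data=json.dumps(payload))
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    print(predict("test_image.jpg"))
```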

2 Likes

I just hadn’t understood that I need to select the “Edge” option for the “Export model” action to be available.

Thank you for the useful documentation.

As an additional question: can I find out which version of TensorFlow was used for these exported models? (I have some incompatibility issues when using them in my software.)

1 Like

As this is more of a TensorFlow question, I suggest you ask in their forum.
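
That said, one quick thing you can check locally: the saved_model.pb in the export records the TensorFlow version it was written with. Here is a minimal sketch of how to read it, assuming the export contains a standard SavedModel (the file path is just an example).

```python
from tensorflow.core.protobuf import saved_model_pb2

# Path to the exported saved_model.pb is an assumption; adjust as needed.
SAVED_MODEL_PB = "exported_model/saved_model.pb"

# Parse the SavedModel protobuf directly and print the TensorFlow version
# recorded in each MetaGraphDef's meta_info_def.
saved_model = saved_model_pb2.SavedModel()
with open(SAVED_MODEL_PB, "rb") as f:
    saved_model.ParseFromString(f.read())

for meta_graph in saved_model.meta_graphs:
    info = meta_graph.meta_info_def
    print("TensorFlow version:", info.tensorflow_version)
    print("Git version:", info.tensorflow_git_version)
```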

1 Like