Hi,
I have a Google Colab notebook with some Python functions that are used to calculate the features for a model.
The functions use as inputs data from an API.
My question is: can (or should) I calculate the features inside a Feature Store and feed the results to the model?
Or, if not, in which instance should I perform the calculations before feeding the results into the model?
It is possible to use Vertex AI Feature Store to fetch the data, so you can use it as part of the Vertex AI workflow to train custom or AutoML models in Vertex.
You can see the Vertex AI workflow here[1].
[1]https://cloud.google.com/vertex-ai/docs/beginner/beginners-guide#workflow
Hi, thanks for your comments.
After reading the documents, I understand that the data stored in the
Feature Store should be “static”: data loaded from existing databases,
not calculated within the Feature Store itself.
For example, if I wanted to add a simple average of the last 30 data
entries obtained from an API that sends data in real time at 5-minute
intervals, should I:
Connect the API to the Feature Store, store each data entry from the API,
and then calculate the average inside the Feature Store?
Or connect the API to Google BigQuery, store the data in BigQuery,
calculate the average there, and then send the result to the endpoint
where the model is deployed?
Or connect a Google Colab notebook to the API, perform the calculations
there, package the notebook in a container, and send the data to the
endpoint where the model is deployed?
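As a rough sketch of the last option (computing the feature in your own code before calling the deployed endpoint), the 30-entry average could look like the snippet below. The `RollingAverage` class and `fetch_latest_reading` function are hypothetical stand-ins, since the real API isn't specified here; the prediction request itself is omitted.

```python
from collections import deque

WINDOW = 30  # average over the last 30 entries

class RollingAverage:
    """Keeps only the last WINDOW values and exposes their mean."""
    def __init__(self, window=WINDOW):
        self.values = deque(maxlen=window)  # old entries drop off automatically

    def add(self, value):
        self.values.append(value)

    def mean(self):
        if not self.values:
            raise ValueError("no data received yet")
        return sum(self.values) / len(self.values)

def fetch_latest_reading(t):
    """Placeholder for the real-time API (one reading every 5 minutes)."""
    return float(t)

avg = RollingAverage()
for t in range(100):  # e.g. 100 polling cycles
    avg.add(fetch_latest_reading(t))

feature_value = avg.mean()  # this is the value you would send to the endpoint
print(feature_value)  # → 84.5 (mean of the simulated readings 70..99)
```

The same computation could equally be expressed as an `AVG(...) OVER (ORDER BY ts ROWS 29 PRECEDING)` window query if the raw entries were landed in BigQuery first, which is essentially your second option.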
Once again, thanks for your help.