Feasibility Check: Building a Custom Colorimetric Pipeline (RAW to CIELAB) on the Pixel TPU via TFLite

Hello Community and Google Engineers,

I am a member of the Google Developer Program working on a specialized computer vision project that requires absolute colorimetric accuracy (scientific measurement) rather than the standard “perceptually pleasing” computational photography provided by the stock ISP.

Based on my research, I understand that the standard ISP pipeline (Tone Mapping, AWB, Color Space Transforms) is proprietary and cannot be modified by external developers to support non-standard rendering intents.

Therefore, I am exploring a “Parallel Pipeline” architecture on Pixel devices (specifically Pixel 7/8/9 Pro) and need to confirm the technical feasibility of the following workflow before committing resources:

The Proposed Workflow:

* Capture: Acquire RAW_SENSOR data via the Camera2 API (aiming for linear, scene-referred data).

* Process: Pass this RAW data directly to a custom TensorFlow Lite (TFLite) model.

* Compute: The model performs a learned “spectral reconstruction” or “complex color space transformation” (mapping RAW RGB to absolute CIELAB).

* Hardware Acceleration: Execute this model on the Pixel TPU (using the NNAPI delegate or the new Google AI Edge SDK).
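To make the "Compute" step concrete, here is a minimal NumPy sketch of the target transform. The 3x3 matrix is the standard sRGB-to-XYZ matrix, used only as a stand-in for whatever mapping the TFLite model would actually learn, and D65 is assumed as the reference white:

```python
import numpy as np

# Stand-in 3x3 matrix mapping white-balanced linear RGB to CIE XYZ.
# In the proposed pipeline this mapping would be learned by the model;
# the sRGB/D65 matrix is used here purely for illustration.
RGB_TO_XYZ = np.array([
    [0.4124564, 0.3575761, 0.1804375],
    [0.2126729, 0.7151522, 0.0721750],
    [0.0193339, 0.1191920, 0.9503041],
])

# D65 reference white in XYZ, normalized to Y = 1
WHITE_D65 = np.array([0.95047, 1.00000, 1.08883])

def linear_rgb_to_lab(rgb):
    """Map linear, scene-referred RGB to CIELAB (CIE 1976 formulas)."""
    xyz = RGB_TO_XYZ @ np.asarray(rgb, dtype=np.float64)
    t = xyz / WHITE_D65
    # Piecewise cube-root function from the CIELAB definition
    delta = 6.0 / 29.0
    f = np.where(t > delta**3, np.cbrt(t), t / (3 * delta**2) + 4.0 / 29.0)
    L = 116.0 * f[1] - 16.0
    a = 500.0 * (f[0] - f[1])
    b = 200.0 * (f[1] - f[2])
    return L, a, b

# Sanity check: linear white (1, 1, 1) lands at L* = 100, a* = b* = 0
print(linear_rgb_to_lab([1.0, 1.0, 1.0]))
```

The open question is whether exactly this kind of float math can stay on the TPU rather than falling back to CPU/GPU.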

My Critical Questions (The “Blockers”):

* TPU Access for Regression: Does the Pixel TPU delegate currently support the high-precision floating-point operations required for regression tasks (outputting precise CIELAB coordinate values), or is it strictly optimized for quantized classification/detection tasks?

* RAW Data Integrity: When capturing RAW_SENSOR on Pixel, does the firmware apply any irreversible “baking” (like local tone mapping or spatial gain maps) before the data reaches the API, which would render scientific colorimetry impossible?

* Throughput: Is it realistic for an external developer to achieve near real-time performance for high-resolution RAW processing on the TPU, or is the required bandwidth/memory access privileged to first-party Google services (like Magic Eraser/Real Tone)?
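To put a number on the first blocker, here is a quick sketch of why 8-bit quantization worries me for CIELAB regression. It assumes a naive affine quantization of L* over [0, 100] with 256 levels; a real converter may choose scales differently, but the step size is the same order of magnitude:

```python
import numpy as np

def quantize_roundtrip(x, lo, hi, levels=256):
    """Simulate affine 8-bit quantization and dequantization.
    Worst-case round-trip error is half the step (hi - lo) / (levels - 1)."""
    scale = (hi - lo) / (levels - 1)
    q = np.round((x - lo) / scale)
    return lo + q * scale

# L* spans [0, 100]; an 8-bit grid quantizes it in steps of ~0.392,
# i.e. up to ~0.196 units of round-trip error. That is a meaningful
# fraction of a just-noticeable difference (Delta E ~ 1), so it
# matters for metrology even if it is invisible in ordinary photos.
L_true = np.linspace(0.0, 100.0, 10001)
L_q = quantize_roundtrip(L_true, 0.0, 100.0)
print("max |error|:", np.max(np.abs(L_q - L_true)))  # ~0.196
```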

Context:

My goal is to treat the Pixel device as a colorimeter. If the hardware abstraction layer prevents raw linear access or restricts TPU usage for this type of custom signal processing, I will need to pivot my hardware strategy.

Any insights from the TensorFlow Lite or Pixel Camera engineering teams would be invaluable.

Thank you.

Hey,

Hope you’re keeping well.

On Pixel devices, the NNAPI delegate can target the TPU (Google Edge TPU) for certain model operations, but it is primarily optimized for 8-bit quantized models. High-precision float32 regression workloads may fall back to the CPU or GPU depending on operator support, so you will need to profile with the NNAPI delegate enabled in TFLite to confirm the actual execution paths.

RAW_SENSOR capture via the Camera2 API provides Bayer data after sensor-level corrections such as per-pixel gain maps. While global tone mapping is not applied, some calibration steps are baked in and cannot be bypassed.
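To illustrate why a baked-in spatial gain cannot always be undone after capture, here is a toy NumPy sketch (synthetic values, not actual Pixel firmware behavior): once the gained signal clips at the sensor's full scale, dividing the gain back out cannot recover the original scene value.

```python
import numpy as np

# Toy linear scene radiances, some close to full scale
scene = np.array([[0.20, 0.90],
                  [0.50, 0.95]])

# Hypothetical per-pixel gain map (e.g. a lens-shading correction)
gain = np.array([[1.1, 1.3],
                 [1.2, 1.4]])

# "Baked" capture: gain applied in firmware, then clipped at full scale
raw = np.clip(scene * gain, 0.0, 1.0)

# Dividing the gain back out recovers unclipped pixels exactly...
recovered = raw / gain
unclipped = scene * gain <= 1.0
print(np.allclose(recovered[unclipped], scene[unclipped]))    # True
# ...but pixels that clipped after gain are irreversibly lost
print(np.allclose(recovered[~unclipped], scene[~unclipped]))  # False
```

So even where the gain map is reported in metadata, highlight information lost to clipping after the gain stage cannot be reconstructed.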

Thanks and regards,
Taz


noted


Hi Taz,

Thank you for sharing your expertise.

Your clarification regarding the TPU’s optimization for 8-bit quantized models and the “baked-in” calibration steps in RAW_SENSOR was exactly the missing piece I needed. This solved a core technical question that had been bothering me for a long time.

I truly appreciate this expert-level response—it has helped me define the precise constraints of the hardware pipeline.

Best regards,

Jia

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.