Field Fast Precomputation

Example

From version 14.0 to version 15.0, in the product module, a new field value_count appeared on the model product.template.attribute.line. This is a simple field that stores the number of value_ids records associated with each record.
Source Code Differences
Version 14.0
from odoo import fields, models


class ProductTemplateAttributeLine(models.Model):
    _name = "product.template.attribute.line"

    value_ids = fields.Many2many(
        comodel_name="product.attribute.value",
        relation="product_attribute_value_product_template_attribute_line_rel",
    )
See Full v14 Code Source.
Version 15.0
from odoo import api, fields, models


class ProductTemplateAttributeLine(models.Model):
    _name = "product.template.attribute.line"

    value_ids = fields.Many2many(
        comodel_name="product.attribute.value",
        relation="product_attribute_value_product_template_attribute_line_rel",
    )
    value_count = fields.Integer(
        compute="_compute_value_count",
        store=True,
    )

    @api.depends("value_ids")
    def _compute_value_count(self):
        for record in self:
            record.value_count = len(record.value_ids)
See Full v15 Code Source.
Analysis
---Fields in module 'product'---
product / product.template.attribute.line / value_count (integer):
NEW isfunction: function, stored
Result without migration script / Expected Result
The new field will be computed by the ORM, so strictly speaking there is nothing to do. However, the computation will be performed for every row of the table. If the table contains a very large amount of data, this can be slow: even though the ORM has a prefetch mechanism for reading data, the writing is done record by record, via individual SQL UPDATE statements, which is not optimized at all and yields a (quasi) linear time complexity.

See: https://en.wikipedia.org/wiki/Time_complexity#Linear_time

As a result, such a computation on a huge database can take hours or days, or worse, trigger an out-of-memory error that aborts the migration. How severe these inconveniences are obviously depends on the size of the table in question and on the power and configuration of the server performing the migration.
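The cost difference described above can be sketched outside of Odoo. The snippet below is a minimal illustration (it uses an in-memory sqlite3 database and a hypothetical table, not Odoo's ORM): recomputing a stored field through the ORM issues one UPDATE statement per record, so the number of statements grows linearly with the table size.

```python
# Sketch only: hypothetical table, sqlite3 stands in for PostgreSQL.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE line (id INTEGER PRIMARY KEY, value_count INTEGER)")
cur.executemany(
    "INSERT INTO line (id) VALUES (?)", [(i,) for i in range(1, 10_001)]
)

# ORM-style recomputation: one UPDATE per record.
statements = 0
for (line_id,) in cur.execute("SELECT id FROM line").fetchall():
    cur.execute("UPDATE line SET value_count = 0 WHERE id = ?", (line_id,))
    statements += 1

print(statements)  # one statement per row: 10000
```

A single set-based UPDATE, as used in the migration script below, replaces all of these round trips with one statement.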
Contribution to OpenUpgrade
Update the upgrade_analysis_work.txt file
Add a comment after the line:
product / product.template.attribute.line / value_count (integer):
NEW isfunction: function, stored
# DONE: pre-migration: fast computed value_count
Write a migration script

In the pre-migration.py script, add:
from openupgradelib import openupgrade


def compute_product_template_attribute_line_value_count(env):
    openupgrade.add_fields(env, [(
        "value_count",                      # Field name
        "product.template.attribute.line",  # Model name
        "product_template_attribute_line",  # Table name
        "integer",                          # Odoo field type (in lower case)
        False,                              # [Optional] SQL type (for custom fields)
        "product",                          # Module name
        False,                              # [Optional] Default value
    )])
    openupgrade.logged_query(
        env.cr,
        """
        UPDATE product_template_attribute_line al
        SET value_count = (
            SELECT COUNT(*)
            FROM product_attribute_value_product_template_attribute_line_rel
            WHERE product_template_attribute_line_id = al.id
        )""",
    )


@openupgrade.migrate()
def migrate(env, version):
    compute_product_template_attribute_line_value_count(env)
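The effect of the script can be replayed in miniature. The sketch below mimics the two steps (add the column before the ORM sees it, then backfill every row with one set-based UPDATE containing a correlated subquery) on an in-memory sqlite3 database; the shortened table names `attribute_line` and `value_rel` are hypothetical stand-ins for the real Odoo tables.

```python
# Sketch only: hypothetical shortened table names, sqlite3 instead of
# PostgreSQL, plain cursor calls instead of openupgrade helpers.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE attribute_line (id INTEGER PRIMARY KEY)")
cur.execute(
    "CREATE TABLE value_rel (attribute_line_id INTEGER, value_id INTEGER)"
)
cur.executemany("INSERT INTO attribute_line (id) VALUES (?)", [(1,), (2,), (3,)])
cur.executemany(
    "INSERT INTO value_rel VALUES (?, ?)",
    [(1, 10), (1, 11), (3, 12)],  # line 1 has 2 values, line 2 none, line 3 one
)

# Step 1: create the column, as openupgrade.add_fields() does.
cur.execute("ALTER TABLE attribute_line ADD COLUMN value_count INTEGER")

# Step 2: one set-based UPDATE fills every row, like the logged_query above.
cur.execute(
    """
    UPDATE attribute_line
    SET value_count = (
        SELECT COUNT(*)
        FROM value_rel
        WHERE value_rel.attribute_line_id = attribute_line.id
    )"""
)

print(cur.execute(
    "SELECT id, value_count FROM attribute_line ORDER BY id"
).fetchall())
# -> [(1, 2), (2, 0), (3, 1)]
```

Because the column is already present and filled when the ORM loads the module, the `@api.depends` recomputation is skipped for existing rows.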
Notes

These scripts are about optimization. As a contributor to OpenUpgrade for a module, if you are not facing a problem of excessive duration, you can propose migration scripts _without_ such optimizations, especially when the SQL queries are complex to write. Another contributor can always propose a PR with the performance improvement if they run into this problem.