Each time you run a geoprocessing tool, information about the tool and its parameters is generated and stored by default. This information is called the geoprocessing history. Every tool that creates new output data or updates input data adds metadata about the run: the name of the tool, its location, and the parameters used.
In a production environment, geoprocessing tools and various Python scripts may run on a daily basis (e.g., Compress and Analyze), which means that after a few years the metadata can grow very large (e.g., 30 - 100 MB). Such large metadata usually has a negative impact on overall database performance. This article describes what you can do to address the problem and avoid it in the future.
Disable metadata logging
The Esri geoprocessing tools insert the metadata into the enterprise geodatabases by default. This behavior can be disabled in ArcGIS Desktop and ArcGIS Pro, or by using a Python script.
For script tools and standalone scripts (scripts that run outside of an ArcGIS application), you can enable or disable history logging using the SetLogHistory function. We recommend disabling logging whenever possible, especially for compress and analysis scripts, by adding this line to the top of your script:
arcpy.SetLogHistory(False)
ArcGIS Desktop and ArcGIS Pro
In ArcCatalog or ArcMap, logging can be disabled by unchecking the Logging option in the Geoprocessing > Geoprocessing Options dialog. In ArcGIS Pro, the equivalent setting is the Write geoprocessing operations to dataset metadata option under Project > Options > Geoprocessing.
Delete existing metadata logs
We recommend deleting the metadata using a Python script. Python scripts are available for ArcMap as well as for ArcGIS Pro; if possible, we recommend performing the cleanup with ArcGIS Pro. You can find an example script at the bottom of this article.
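To illustrate what such a cleanup does, here is a minimal, standard-library-only sketch that strips geoprocessing history from a single metadata document. It assumes the common Esri metadata layout in which each logged tool run is a Process element under lineage; a real cleanup script would apply this per dataset through arcpy rather than on raw XML strings.

```python
import xml.etree.ElementTree as ET

def strip_gp_history(doc_xml: str) -> str:
    """Remove geoprocessing history (lineage/Process elements) from an
    Esri metadata document and return the cleaned XML string."""
    root = ET.fromstring(doc_xml)
    # Each logged tool run is a <Process> child of <lineage>.
    for lineage in root.iter("lineage"):
        for process in lineage.findall("Process"):
            lineage.remove(process)
    return ET.tostring(root, encoding="unicode")

# Hypothetical sample record with two logged tool runs.
doc = (
    "<metadata><Esri><DataProperties><lineage>"
    "<Process ToolSource='Compress'>...</Process>"
    "<Process ToolSource='Analyze'>...</Process>"
    "</lineage></DataProperties></Esri></metadata>"
)
cleaned = strip_gp_history(doc)
```

The rest of the metadata document (item description, thumbnails, and so on) is left untouched; only the history entries are removed, which is also how the linked How To scripts behave.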
Metadata and its impact on system performance
Large metadata influences the following processes, among others:
- Access to the database via ArcCatalog or ArcMap
- Restart of the WebOffice application
- Start of GEONIS projects and MXDs
Which customers are affected by this problem
- All customers using ArcGIS Enterprise geodatabases (Oracle, SQL Server, PostgreSQL)
To determine how much space is currently being used, the metadata size can be queried. The following SQL queries return the 20 largest workspace and feature class metadata entries for each database system:
MS SQL Server
SELECT TOP 20 datalength(documentation) AS length, name
FROM sde.gdb_items  -- adjust the schema (sde or dbo) to your geodatabase
WHERE documentation IS NOT NULL
ORDER BY length DESC;
Oracle
SELECT doc_length, name FROM (
    SELECT length(documentation) AS doc_length, name
    FROM sde.gdb_items
    WHERE documentation IS NOT NULL
    ORDER BY doc_length DESC
)
WHERE rownum <= 20;
PostgreSQL
SELECT length(xmlserialize(CONTENT documentation AS varchar)) AS length, name
FROM sde.gdb_items
WHERE documentation IS NOT NULL
ORDER BY length DESC
LIMIT 20;
Result example (SQL Server):
In this example, "Length" corresponds to about 60 million bytes (60 MB).
ArcCatalog needed about 38 seconds to open this database connection. After clearing the metadata logs, the connection could be established in a few seconds. Performance gains can vary greatly depending on the environment and database system.
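If you want to experiment with the size-check pattern before running it against a production geodatabase, the same query shape can be tried locally. The sketch below uses sqlite3 as a stand-in with hypothetical sample data; against a real geodatabase you would run one of the SQL statements above through your database client instead.

```python
import sqlite3

# Hypothetical local stand-in for the GDB_ITEMS system table, so the
# size query can be tried without an enterprise geodatabase.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE gdb_items (name TEXT, documentation TEXT)")
conn.executemany(
    "INSERT INTO gdb_items VALUES (?, ?)",
    [
        ("Workspace", "<metadata>" + "<Process>...</Process>" * 1000 + "</metadata>"),
        ("Parcels", "<metadata><Process>...</Process></metadata>"),
        ("Roads", None),  # rows without metadata are skipped by the filter
    ],
)

# Same shape as the queries above: largest documentation blobs first.
rows = conn.execute(
    "SELECT length(documentation) AS len, name "
    "FROM gdb_items WHERE documentation IS NOT NULL "
    "ORDER BY len DESC LIMIT 20"
).fetchall()
for size, name in rows:
    print(name, size)
```

Datasets whose documentation is NULL never appear in the result, so a short list with large values at the top is exactly the symptom described in this article.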
How To: Delete geoprocessing history from a geodatabase in ArcMap using Python
How To: Delete geoprocessing history from a geodatabase in ArcGIS Pro using Python
FAQ: Why is the connection slow to a SQL Server geodatabase?