This Bugzilla instance is a read-only archive of historic NetBeans bug reports. To report a bug in NetBeans please follow the project's instructions for reporting issues.
This issue resulted from the Tag Editor Support API Inception Review. There are known performance problems while updating the model for larger HTML and JSP files. These problems should be analyzed and solved.
I have changed the model update behaviour in the following way:

1) The document is not locked during the model update. If the document changes during the update, the update is cancelled and the partially parsed data are thrown away; the pending changes are taken into account in the next turn. This allows the user to keep writing into the document even when the model update takes a long time. The disadvantage of this approach is that the parsed data are discarded even when they are almost complete. I will add logic to prevent this in the future - no time for it now.

2) I have improved the incremental update mechanism itself. Once the model is created, it is almost never necessary to reparse the entire document again; the parser skips all pieces of the document which 'it thinks' are unnecessary to reparse. This works quite nicely even for very big files => the model update is done in a few milliseconds even for 1MB XML files (the size of the document does not play a significant role).

I consider this TCA as fixed.
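The cancel-on-change strategy in 1) can be sketched roughly as below. This is a minimal, self-contained illustration, not the actual NetBeans implementation: the class name `ModelUpdater`, the methods `onDocumentEdit` and `runUpdatePass`, and the step-based parse loop are all assumptions made for the example. The key idea is only that the document stays unlocked, a concurrent edit flips a flag, and an in-progress update that sees the flag discards its partial results and leaves the work for the next pass.

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Hypothetical sketch of the cancel-on-change update strategy: the document
// is never locked; an edit arriving mid-update cancels the current pass and
// its partial results are thrown away, to be redone on the next pass.
class ModelUpdater {
    private final AtomicBoolean documentChanged = new AtomicBoolean(false);

    // Called by the editor whenever the user edits the document.
    void onDocumentEdit() {
        documentChanged.set(true);
    }

    // Runs one update pass of `steps` parse steps; returns true if the pass
    // completed, false if it was cancelled because the document changed.
    boolean runUpdatePass(Runnable parseStep, int steps) {
        documentChanged.set(false);
        for (int i = 0; i < steps; i++) {
            if (documentChanged.get()) {
                return false; // discard partial results; retry next turn
            }
            parseStep.run();
        }
        return true;
    }
}
```

A pass with no concurrent edits completes normally; a parse step that triggers an edit (here simulated directly) causes the pass to report cancellation.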
Additional comment: To allow #2 I have changed the DocumentModelProvider SPI a bit. There is now the following method:

public void updateModel(DocumentModelModificationTransaction trans, DocumentModel model, DocumentChange[] changes) throws DocumentModelException, DocumentModelTransactionCancelledException;

The framework passes the list of document changes to the provider, and it is up to the provider which parts of the model will be updated and how. The method now throws the DocumentModelTransactionCancelledException, which happens when the provider tries to manipulate a transaction that has already been cancelled (this happens when the document is changed during the update).
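The shape of such a provider might look like the sketch below. The classes here are simplified stand-ins defined for the example, not the real NetBeans SPI types (the real DocumentModelModificationTransaction, DocumentModel, and DocumentChange have richer contracts); only the method's control flow follows the description above: walk the supplied changes, touch only the affected spans, and let the cancelled-transaction exception abort the update as soon as a concurrent edit invalidates it.

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for the SPI exception thrown when the transaction was cancelled
// by a document change that arrived during the update.
class DocumentModelTransactionCancelledException extends Exception {}

// Stand-in for one reported document change (offset/length of the edit).
class DocumentChange {
    final int offset, length;
    DocumentChange(int offset, int length) { this.offset = offset; this.length = length; }
}

// Stand-in for the modification transaction: refuses further work once cancelled.
class Transaction {
    final List<int[]> regions = new ArrayList<>();
    private boolean cancelled;

    void cancel() { cancelled = true; }

    void addRegion(int offset, int length) throws DocumentModelTransactionCancelledException {
        if (cancelled) throw new DocumentModelTransactionCancelledException();
        regions.add(new int[] { offset, length });
    }
}

// Sketch of an incremental provider: it reparses only the spans named in the
// changes array instead of the whole document.
class IncrementalProvider {
    void updateModel(Transaction trans, DocumentChange[] changes)
            throws DocumentModelTransactionCancelledException {
        for (DocumentChange c : changes) {
            trans.addRegion(c.offset, c.length); // would reparse just this span
        }
    }
}
```

If the transaction is cancelled before (or while) the provider works through the changes, the exception propagates and the partial update is abandoned, matching the behaviour described above.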
Closing DevRev issue.