<?xml version="1.0" encoding="UTF-8"?><xml><records><record><source-app name="Biblio" version="6.x">Drupal-Biblio</source-app><ref-type>47</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">D. Eckhard</style></author><author><style face="normal" font="default" size="100%">A. S. Bazanella</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">On the global convergence of identification of output error models</style></title><secondary-title><style face="normal" font="default" size="100%">18th IFAC World Congress</style></secondary-title></titles><dates><year><style face="normal" font="default" size="100%">2011</style></year></dates><publisher><style face="normal" font="default" size="100%">IFAC</style></publisher><pub-location><style face="normal" font="default" size="100%">Milan</style></pub-location><pages><style face="normal" font="default" size="100%">9058–9063</style></pages><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">&lt;p&gt;The Output Error Method leads to an optimization problem based on a multi-modal criterion. Iterative algorithms such as steepest descent are usually used to search for the global minimum of the criterion, but these algorithms can get stuck at a local minimum. This paper presents sufficient conditions for the convergence of the steepest descent algorithm to the global minimum of the cost function. Moreover, it presents constraints on the input spectrum which ensure that the convergence conditions are satisfied. These constraints are convex and can easily be included in an experiment design approach to ensure convergence of the iterative algorithms to the global minimum of the criterion.&lt;/p&gt;</style></abstract><notes><style face="normal" font="default" size="100%">n/a</style></notes></record></records></xml>