
pyro ExponentialLR: how to set the optimizer learning rate (PyTorch deep neural network / BNN)

First: pyro does not support `ReduceLROnPlateau` here, because that scheduler needs the loss as an input value on every step, which adds computational cost.
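For comparison, in plain PyTorch the plateau scheduler has to be fed a metric on every call. A minimal sketch of the standard usage (the toy model and data exist purely to drive the scheduler):

import torch

# toy model and data, only to produce a loss value
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode='min', factor=0.5, patience=20
)

x, y = torch.randn(32, 10), torch.randn(32, 1)
loss_fn = torch.nn.MSELoss()

for epoch in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    # unlike ExponentialLR, ReduceLROnPlateau must be given the loss value each epoch
    scheduler.step(loss.item())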

Second: SVI's support for schedulers is a bit finicky.

Three optimizers belong to `pyro.optim.PyroOptim`: AdagradRMSProp, ClippedAdam, and DCTAdam. Even so, code like the following still raises an error:

optimizer = pyro.optim.SGD
# exponential learning-rate decay
pyro_scheduler = pyro.optim.ExponentialLR({'optimizer': optimizer, 'optim_args': {'lr': learn_rate}, 'gamma': 0.1})

Traceback (most recent call last):
  File "/home/aistudio/bnn_pyro_fso_middle_2_16__256.py", line 441, in <module>
    svi = SVI(model, mean_field_guide, optimizer, loss=Trace_ELBO())
  File "/home/aistudio/external-libraries/pyro/infer/svi.py", line 72, in __init__
    raise ValueError(
ValueError: Optimizer should be an instance of pyro.optim.PyroOptim class.

The correct approach is to give the scheduler constructor a torch optimizer class and to pass the resulting scheduler (itself a `PyroOptim` instance) to `SVI`:

optimizer = torch.optim.SGD
# exponential learning-rate decay
pyro_scheduler = pyro.optim.ExponentialLR({'optimizer': optimizer, 'optim_args': {'lr': learn_rate}, 'gamma': 0.1})

# Optional: a ReduceLROnPlateau scheduler instead. In 'min' mode the learning rate is
# reduced when the monitored (validation) loss stops decreasing; factor=0.5 multiplies
# the learning rate by 0.5, and patience=20 waits for 20 epochs without improvement first.
# scheduler = ReduceLROnPlateau(optimizer, 'min', factor=0.5, patience=20, verbose=True)
# svi = SVI(model, mean_field_guide, optimizer, loss=Trace_ELBO())
svi = SVI(model, mean_field_guide, pyro_scheduler, loss=Trace_ELBO())
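With the scheduler in place, training follows the same pattern as the library example further below: call `svi.step()` once per minibatch and the scheduler once per epoch. A minimal sketch, assuming `model`, `mean_field_guide`, an `epochs` count and a `train_loader` DataLoader are already defined (the arguments of `svi.step()` must match the signatures of your model and guide):

for epoch in range(epochs):
    epoch_loss = 0.0
    for x_batch, y_batch in train_loader:   # assumed DataLoader yielding (inputs, targets)
        epoch_loss += svi.step(x_batch, y_batch)
    pyro_scheduler.step()                    # decay the learning rate once per epoch
    print(f"epoch {epoch}: ELBO loss = {epoch_loss:.3f}")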

This is a wrapper for `torch.optim.lr_scheduler` objects that adjusts the learning rates of dynamically generated parameters.

    :param scheduler_constructor: a `torch.optim.lr_scheduler` class
    :param optim_args: a dictionary of learning arguments for the optimizer, or a callable that returns such a dictionary. It must contain the key 'optimizer' whose value is a PyTorch optimizer
    :param clip_args: a dictionary of `clip_norm` and/or `clip_value` arguments, or a callable that returns such a dictionary.

    Example::

        optimizer = torch.optim.SGD
        scheduler = pyro.optim.ExponentialLR({'optimizer': optimizer, 'optim_args': {'lr': 0.01}, 'gamma': 0.1})
        svi = SVI(model, guide, scheduler, loss=TraceGraph_ELBO())
        for i in range(epochs):
            for minibatch in DataLoader(dataset, batch_size):
                svi.step(minibatch)
            scheduler.step()
 

"""
A wrapper for :class:`~torch.optim.lr_scheduler` objects that adjusts learning rates
for dynamically generated parameters.:param scheduler_constructor: a :class:`~torch.optim.lr_scheduler`
:param optim_args: a dictionary of learning arguments for the optimizer or a callable that returnssuch dictionaries. must contain the key 'optimizer' with pytorch optimizer value
:param clip_args: a dictionary of clip_norm and/or clip_value args or a callable that returnssuch dictionaries.Example::optimizer = torch.optim.SGDscheduler = pyro.optim.ExponentialLR({'optimizer': optimizer, 'optim_args': {'lr': 0.01}, 'gamma': 0.1})svi = SVI(model, guide, scheduler, loss=TraceGraph_ELBO())for i in range(epochs):for minibatch in DataLoader(dataset, batch_size):svi.step(minibatch)scheduler.step()

The source code of pyro.optim.lr_scheduler:

# Copyright (c) 2017-2019 Uber Technologies, Inc.
# SPDX-License-Identifier: Apache-2.0

from typing import Any, Dict, Iterable, List, Optional, Union, ValuesView

from torch import Tensor

from pyro.optim.optim import PyroOptim


class PyroLRScheduler(PyroOptim):
    """
    A wrapper for :class:`~torch.optim.lr_scheduler` objects that adjusts learning rates
    for dynamically generated parameters.

    :param scheduler_constructor: a :class:`~torch.optim.lr_scheduler`
    :param optim_args: a dictionary of learning arguments for the optimizer or a callable
        that returns such dictionaries. must contain the key 'optimizer' with pytorch
        optimizer value
    :param clip_args: a dictionary of clip_norm and/or clip_value args or a callable that
        returns such dictionaries.

    Example::

        optimizer = torch.optim.SGD
        scheduler = pyro.optim.ExponentialLR({'optimizer': optimizer, 'optim_args': {'lr': 0.01}, 'gamma': 0.1})
        svi = SVI(model, guide, scheduler, loss=TraceGraph_ELBO())
        for i in range(epochs):
            for minibatch in DataLoader(dataset, batch_size):
                svi.step(minibatch)
            scheduler.step()
    """

    def __init__(
        self,
        scheduler_constructor,
        optim_args: Union[Dict],
        clip_args: Optional[Union[Dict]] = None,
    ):
        # pytorch scheduler
        self.pt_scheduler_constructor = scheduler_constructor
        # torch optimizer
        pt_optim_constructor = optim_args.pop("optimizer")
        # kwargs for the torch optimizer
        optim_kwargs = optim_args.pop("optim_args")
        self.kwargs = optim_args
        super().__init__(pt_optim_constructor, optim_kwargs, clip_args)

    def __call__(self, params: Union[List, ValuesView], *args, **kwargs) -> None:
        super().__call__(params, *args, **kwargs)

    def _get_optim(
        self, params: Union[Tensor, Iterable[Tensor], Iterable[Dict[Any, Any]]]
    ):
        optim = super()._get_optim(params)
        return self.pt_scheduler_constructor(optim, **self.kwargs)

    def step(self, *args, **kwargs) -> None:
        """
        Takes the same arguments as the PyTorch scheduler
        (e.g. optional ``loss`` for ``ReduceLROnPlateau``)
        """
        for scheduler in self.optim_objs.values():
            scheduler.step(*args, **kwargs)
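Reading `__init__` above, the single dict passed to the wrapper is split three ways: 'optimizer' and 'optim_args' are popped off, and whatever remains is forwarded to the scheduler constructor. A small sketch of which key ends up where (Adam is used here purely as an illustration):

import torch
import pyro

scheduler = pyro.optim.ExponentialLR({
    'optimizer': torch.optim.Adam,   # torch optimizer *class*, not an instance
    'optim_args': {'lr': 1e-2},      # kwargs for Adam, applied to each dynamically created parameter
    'gamma': 0.1,                    # remaining keys go to torch.optim.lr_scheduler.ExponentialLR
})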

