Installing ML-Agents on macOS M1: A Troubleshooting Guide
#python #machinelearning #deeplearning

If you're an AI and ML enthusiast like me, you may have run into challenges trying to install the ML-Agents Python library on a Mac M1. Don't worry: I've been through the pain myself and want to share my experience so your process goes more smoothly.

Step 1: Create a Virtual Environment

First, let's create a virtual environment with the venv tool. Open a terminal and run:

python -m venv rl-env
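The next step assumes the commands run inside this virtual environment, so remember to activate it before installing anything. A minimal sketch (using `python3` explicitly, in case `python` isn't on your PATH):

```shell
# Create the venv, then activate it so pip installs into it
python3 -m venv rl-env
source rl-env/bin/activate

# The active interpreter should now live inside rl-env
python -c "import sys; print(sys.prefix)"
```

The printed prefix should end in `rl-env`; if it doesn't, the activation didn't take effect in your current shell.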

Step 2: Install ML-Agents

Now, let's install ML-Agents with pip. Run this command inside the virtual environment:

python -m pip install mlagents==0.30.0

Step 3: Verify the Installation

To make sure ML-Agents installed correctly, run the following command:

mlagents-learn --help

If you don't run into any errors, consider yourself lucky!
But if you hit the error below, rest assured, I've got you covered.

Traceback (most recent call last):
  File "/Users/alpha/miniforge3/envs/rl-venv/bin/mlagents-learn", line 5, in <module>
    from mlagents.trainers.learn import main
  File "/Users/alpha/miniforge3/envs/rl-venv/lib/python3.9/site-packages/mlagents/trainers/learn.py", line 2, in <module>
    from mlagents import torch_utils
  File "/Users/alpha/miniforge3/envs/rl-venv/lib/python3.9/site-packages/mlagents/torch_utils/__init__.py", line 1, in <module>
    from mlagents.torch_utils.torch import torch as torch  # noqa
  File "/Users/alpha/miniforge3/envs/rl-venv/lib/python3.9/site-packages/mlagents/torch_utils/torch.py", line 6, in <module>
    from mlagents.trainers.settings import TorchSettings
  File "/Users/alpha/miniforge3/envs/rl-venv/lib/python3.9/site-packages/mlagents/trainers/settings.py", line 25, in <module>
    from mlagents.trainers.cli_utils import StoreConfigFile, DetectDefault, parser
  File "/Users/alpha/miniforge3/envs/rl-venv/lib/python3.9/site-packages/mlagents/trainers/cli_utils.py", line 5, in <module>
    from mlagents_envs.environment import UnityEnvironment
  File "/Users/alpha/miniforge3/envs/rl-venv/lib/python3.9/site-packages/mlagents_envs/environment.py", line 12, in <module>
    from mlagents_envs.side_channel.side_channel import SideChannel
  File "/Users/alpha/miniforge3/envs/rl-venv/lib/python3.9/site-packages/mlagents_envs/side_channel/__init__.py", line 5, in <module>
    from mlagents_envs.side_channel.default_training_analytics_side_channel import (  # noqa
  File "/Users/alpha/miniforge3/envs/rl-venv/lib/python3.9/site-packages/mlagents_envs/side_channel/default_training_analytics_side_channel.py", line 7, in <module>
    from mlagents_envs.communicator_objects.training_analytics_pb2 import (
  File "/Users/alpha/miniforge3/envs/rl-venv/lib/python3.9/site-packages/mlagents_envs/communicator_objects/training_analytics_pb2.py", line 35, in <module>
    _descriptor.FieldDescriptor(
  File "/Users/alpha/miniforge3/envs/rl-venv/lib/python3.9/site-packages/google/protobuf/descriptor.py", line 561, in __new__
    _message.Message._CheckCalledFromGeneratedFile()
TypeError: Descriptors cannot not be created directly.
If this call came from a _pb2.py file, your generated code is out of date and must be regenerated with protoc >= 3.19.0.
If you cannot immediately regenerate your protos, some other possible workarounds are:
 1. Downgrade the protobuf package to 3.20.x or lower.
 2. Set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python (but this will use pure-Python parsing and will be much slower).

More information: https://developers.google.com/protocol-buffers/docs/news/2022-05-06#python-updates
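The error message itself lists two workarounds. This guide takes the downgrade route (Step 4), but you can also try the slower pure-Python protobuf parser first by setting an environment variable:

```shell
# Workaround 2 from the error message: force the pure-Python protobuf
# implementation (much slower, but skips the generated-code check)
export PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python
echo "$PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION"
```

With this variable set in the same shell, `mlagents-learn --help` should get past the descriptor error, at the cost of slower protobuf parsing.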

Step 4: Downgrade Protobuf

To fix the Protobuf issue, you'll need to downgrade it. Run the following command:

pip install protobuf==3.20.0
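To confirm the downgrade took effect, you can compare the installed version against the 3.20.x ceiling the error message mentions. A small sketch with a hypothetical `protobuf_ok` helper (pure string parsing, no extra dependencies):

```python
def parse_version(version: str) -> tuple:
    """Turn a version string like '3.20.0' into (3, 20, 0) for comparison."""
    return tuple(int(part) for part in version.split(".")[:3])

def protobuf_ok(version: str) -> bool:
    """True if the version is 3.20.x or lower, as the error message requires."""
    return parse_version(version) < (3, 21, 0)

print(protobuf_ok("3.20.0"))  # True: the version we just pinned
print(protobuf_ok("4.25.1"))  # False: would trigger the descriptor error
```

In practice you'd feed it `google.protobuf.__version__` from the active environment.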

Step 5: Re-check the Installation

Now, check again that ML-Agents is installed correctly:

mlagents-learn --help

Step 6: Fix One More Error

You may run into another error about a symbol not being found in the flat namespace. This happens because the prebuilt grpcio wheel isn't linked against the CoreFoundation framework it needs on Apple Silicon Macs.

Traceback (most recent call last):
  File "/Users/alpha/Files/dev/reinforcement-learning-unity/rl-env/bin/mlagents-learn", line 33, in <module>
    sys.exit(load_entry_point('mlagents==0.30.0', 'console_scripts', 'mlagents-learn')())
  File "/Users/alpha/Files/dev/reinforcement-learning-unity/rl-env/bin/mlagents-learn", line 25, in importlib_load_entry_point
    return next(matches).load()
  File "/Users/alpha/miniforge3/lib/python3.10/importlib/metadata/__init__.py", line 171, in load
    module = import_module(match.group('module'))
  File "/Users/alpha/miniforge3/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/Users/alpha/Files/dev/reinforcement-learning-unity/rl-env/lib/python3.10/site-packages/mlagents/trainers/learn.py", line 2, in <module>
    from mlagents import torch_utils
  File "/Users/alpha/Files/dev/reinforcement-learning-unity/rl-env/lib/python3.10/site-packages/mlagents/torch_utils/__init__.py", line 1, in <module>
    from mlagents.torch_utils.torch import torch as torch  # noqa
  File "/Users/alpha/Files/dev/reinforcement-learning-unity/rl-env/lib/python3.10/site-packages/mlagents/torch_utils/torch.py", line 6, in <module>
    from mlagents.trainers.settings import TorchSettings
  File "/Users/alpha/Files/dev/reinforcement-learning-unity/rl-env/lib/python3.10/site-packages/mlagents/trainers/settings.py", line 25, in <module>
    from mlagents.trainers.cli_utils import StoreConfigFile, DetectDefault, parser
  File "/Users/alpha/Files/dev/reinforcement-learning-unity/rl-env/lib/python3.10/site-packages/mlagents/trainers/cli_utils.py", line 5, in <module>
    from mlagents_envs.environment import UnityEnvironment
  File "/Users/alpha/Files/dev/reinforcement-learning-unity/rl-env/lib/python3.10/site-packages/mlagents_envs/environment.py", line 49, in <module>
    from .rpc_communicator import RpcCommunicator
  File "/Users/alpha/Files/dev/reinforcement-learning-unity/rl-env/lib/python3.10/site-packages/mlagents_envs/rpc_communicator.py", line 1, in <module>
    import grpc
  File "/Users/alpha/Files/dev/reinforcement-learning-unity/rl-env/lib/python3.10/site-packages/grpc/__init__.py", line 22, in <module>
    from grpc import _compression
  File "/Users/alpha/Files/dev/reinforcement-learning-unity/rl-env/lib/python3.10/site-packages/grpc/_compression.py", line 20, in <module>
    from grpc._cython import cygrpc
ImportError: dlopen(/Users/alpha/Files/dev/reinforcement-learning-unity/rl-env/lib/python3.10/site-packages/grpc/_cython/cygrpc.cpython-310-darwin.so, 0x0002): symbol not found in flat namespace '_CFDataGetBytes'

To fix this, rebuild grpcio from source with the following commands:

pip uninstall grpcio
export GRPC_PYTHON_LDFLAGS=" -framework CoreFoundation"
pip install grpcio --no-binary :all:

Wrapping Up

Congratulations, you've successfully navigated the installation process! Don't let these hiccups discourage you; troubleshooting is an integral part of any developer's journey.