I'm writing a simple script involving CAS, a jspring security check, redirects, and so on. I want to use Kenneth Reitz's python-requests because it's a great piece of work! However, CAS requires validation over SSL, so I have to get past that step first. I don't know what python-requests expects here — where should this SSL certificate live?
Traceback (most recent call last):
File "./test.py", line 24, in <module>
response = requests.get(url1, headers=headers)
File "build/bdist.linux-x86_64/egg/requests/api.py", line 52, in get
File "build/bdist.linux-x86_64/egg/requests/api.py", line 40, in request
File "build/bdist.linux-x86_64/egg/requests/sessions.py", line 209, in request
File "build/bdist.linux-x86_64/egg/requests/models.py", line 624, in send
File "build/bdist.linux-x86_64/egg/requests/models.py", line 300, in _build_response
File "build/bdist.linux-x86_64/egg/requests/models.py", line 611, in send
requests.exceptions.SSLError: [Errno 1] _ssl.c:503: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
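For reference, requests lets you point certificate verification at your own CA bundle via the verify parameter; the certificate file can live anywhere the script can read. A minimal sketch — the URL and bundle path below are placeholders, not values from the question:

```python
import requests

def cas_get(url, ca_bundle_path):
    # verify= accepts a path to a PEM CA bundle; requests will use it
    # instead of the default certifi bundle to validate the server cert.
    return requests.get(url, verify=ca_bundle_path)

# Example (placeholder URL and path):
# response = cas_get("https://cas.example.edu/login", "/etc/ssl/certs/my_ca.pem")
```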
I'm writing a script in Python 2.6 with pyVmomi, using one of its connection methods:
service_instance = connect.SmartConnect(host=args.ip,
user=args.user,
pwd=args.password)
I get the following warning:
/usr/lib/python2.6/site-packages/requests/packages/urllib3/connectionpool.py:734: InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.org/en/latest/security.html
InsecureRequestWarning)
Interestingly, I did not install urllib3 with pip (yet it sits in /usr/lib/python2.6/site-packages/requests/packages/urllib3/).
I tried, as suggested here:
import urllib3
...
urllib3.disable_warnings()
But that didn't change anything.
After pip install openai, when I try import openai, it shows this error:
urllib3's 'ssl' module is compiled with LibreSSL, not OpenSSL
I was just following a project tutorial on using the OpenAI API, but I got stuck at the very first step: installing and importing OpenAI. I tried to find a solution for the error but came up empty.
This is the message after I try to import OpenAI:
Python 3.9.6 (default, Mar 10 2023, 20:16:38)
[Clang 14.0.3 (clang-1403.0.22.14.1)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import openai
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/yule/Library/Python/3.9/lib/python/site-packages/openai/__init__.py", line 19, in <module>
from openai.api_resources import (
File "/Users/mic/Library/Python/3.9/lib/python/site-packages/openai/api_resources/__init__.py", line 1, in <module>
from openai.api_resources.audio import Audio # noqa: …
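That error comes from urllib3 2.x, which refuses to run on an interpreter whose ssl module was linked against LibreSSL — as the macOS system Python often is. You can check what your interpreter was built against; a common workaround is to pin urllib3 below 2.0 (pip install "urllib3<2") or use a Python build linked against OpenSSL 1.1.1+:

```python
import ssl

# urllib3 v2 requires the ssl module to be compiled against OpenSSL 1.1.1+;
# on an affected macOS build this prints a LibreSSL version string instead.
print(ssl.OPENSSL_VERSION)
```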
I'm running into a problem on a Debian 8 system with python 2.7.9-2 amd64:
marius@pydev:/usr/lib/python2.7/dist-packages/urllib3/contrib$ pip search doo
Traceback (most recent call last):
File "/usr/bin/pip", line 9, in <module>
load_entry_point('pip==1.5.6', 'console_scripts', 'pip')()
File "/usr/lib/python2.7/dist-packages/pkg_resources.py", line 356, in load_entry_point
return get_distribution(dist).load_entry_point(group, name)
File "/usr/lib/python2.7/dist-packages/pkg_resources.py", line 2476, in load_entry_point
return ep.load()
File "/usr/lib/python2.7/dist-packages/pkg_resources.py", line 2190, in load
['__name__'])
File "/usr/lib/python2.7/dist-packages/pip/__init__.py", line 74, in <module>
from pip.vcs import git, mercurial, subversion, bazaar # noqa
File "/usr/lib/python2.7/dist-packages/pip/vcs/mercurial.py", line 9, in <module>
from pip.download import path_to_url
File "/usr/lib/python2.7/dist-packages/pip/download.py", line 22, in <module>
import requests, six …
I want to write a piece of code like this:
from bs4 import BeautifulSoup
import urllib2
url = 'http://www.thefamouspeople.com/singers.php'
html = urllib2.urlopen(url)
soup = BeautifulSoup(html)
But I found that I now have to install the urllib3 package. Also, I can't find any tutorial or example showing how to rewrite the code above, since urllib3 has no urlopen.
Any explanation or example, please?!
P/S: I'm using python 3.4.
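For what it's worth, on Python 3.4 the direct stand-in for urllib2.urlopen is urllib.request.urlopen from the standard library; if you do want urllib3 itself, a PoolManager plays the urlopen role. A sketch under that assumption — fetching is deferred into a function so nothing hits the network on import:

```python
import urllib3

def fetch_soup(url):
    from bs4 import BeautifulSoup  # the parsing step is unchanged
    http = urllib3.PoolManager()        # urllib3 has no urlopen(); use a PoolManager
    resp = http.request("GET", url)     # resp.data holds the response body as bytes
    return BeautifulSoup(resp.data, "html.parser")

# soup = fetch_soup('http://www.thefamouspeople.com/singers.php')
```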
p={
'http':'http://my correct proxy here',
'https':'https://my correct proxy here'
}
self.response=requests.get(url=url,headers=self.headers,timeout=(6,15),proxies=p)
Then it raises this exception:
Traceback (most recent call last):
File "C:\Users\xyl13509876955\Desktop\Monitor\dicks.py", line 61, in send_request
self.response=requests.get(url=url,headers=self.headers,timeout=(6,15),proxies=p)
File "C:\Users\xyl13509876955\AppData\Local\Programs\Python\Python37\lib\site-packages\requests\api.py", line 76, in get
return request('get', url, params=params, **kwargs)
File "C:\Users\xyl13509876955\AppData\Local\Programs\Python\Python37\lib\site-packages\requests\api.py", line 61, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Users\xyl13509876955\AppData\Local\Programs\Python\Python37\lib\site-packages\requests\sessions.py", line 542, in request
resp = self.send(prep, **send_kwargs)
File "C:\Users\xyl13509876955\AppData\Local\Programs\Python\Python37\lib\site-packages\requests\sessions.py", line 655, in send
r = adapter.send(request, **kwargs)
File "C:\Users\xyl13509876955\AppData\Local\Programs\Python\Python37\lib\site-packages\requests\adapters.py", line 449, in send
timeout=timeout
File "C:\Users\xyl13509876955\AppData\Local\Programs\Python\Python37\lib\site-packages\urllib3\connectionpool.py", line 696, in urlopen
self._prepare_proxy(conn)
File …
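One common cause of failures in _prepare_proxy with urllib3 1.26+ is an https:// scheme in the proxy URL: most HTTP(S) proxies accept plain-HTTP client connections even when tunneling HTTPS traffic, so both dictionary values usually use http://. A sketch with a hypothetical proxy address:

```python
import requests

proxies = {
    # The key is the scheme of the *target* URL; the value's scheme is how
    # you talk to the proxy itself -- usually plain http:// for both keys.
    "http": "http://proxy.example.com:8080",
    "https": "http://proxy.example.com:8080",
}
# response = requests.get(url, headers=headers, timeout=(6, 15), proxies=proxies)
```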
Does seeing
urllib3.connectionpool WARNING - Connection pool is full, discarding connection
mean that I am effectively losing data (because the connection is lost)? Or does it mean the connection is simply discarded (because the pool is full), and the same connection will be retried later, once the pool has room again?
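As a side note, the warning fires when a finished connection cannot be returned to a full pool, so the response has already been read and no data is lost. If the warning is noisy under heavy concurrency, the pool can be sized up via requests' HTTPAdapter; the sizes below are illustrative:

```python
import requests
from requests.adapters import HTTPAdapter

session = requests.Session()
# pool_maxsize bounds connections kept alive per host; match it to the
# number of threads making requests so connections are reused, not discarded.
adapter = HTTPAdapter(pool_connections=10, pool_maxsize=50)
session.mount("http://", adapter)
session.mount("https://", adapter)
```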
I'm uploading hundreds of millions of items through a REST API, from a cloud server on Heroku to my database in AWS EC2. I'm using Python, and I keep seeing the following INFO log message:
[requests.packages.urllib3.connectionpool] [INFO] Resetting dropped connection: <hostname>
This "resetting dropped connection" seems to take many seconds (sometimes 30+) before my code continues executing.
Thanks for your help. Andrew.
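The reset itself is requests noticing that the server closed an idle keep-alive connection and transparently opening a new one; the delay is the fresh TCP/TLS handshake. Reusing a single Session at least keeps the connection pool warm between calls; a minimal sketch (the endpoint is hypothetical):

```python
import requests

# One Session reuses urllib3's connection pool across all requests made
# through it, so fewer connections sit idle long enough to be dropped.
session = requests.Session()
# for item in items:
#     session.post("https://api.example.com/items", json=item)
```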
I'm trying to interact with an API from my Python 2.7 shell using a package that depends on python-requests. The thing is, my network (a university library) blocks the remote address.
So to reach the API, I do the following:
~$ ssh -D 8080 name@myserver.com
Then, in a new terminal on the local machine:
~$ export http_proxy=socks5://127.0.0.1:8080 https_proxy=socks5://127.0.0.1:8080
Then I run the program in the Python console, but it fails:
~$ python
>>> import myscript
>>> id = '1213'
>>> token = 'jd87jd9'
>>> connect(id,token)
File "/home/username/.virtualenvs/venv/local/lib/python2.7/site-packages/requests/sessions.py", line 518, in post
return self.request('POST', url, data=data, json=json, **kwargs)
File "/home/username/.virtualenvs/venv/local/lib/python2.7/site-packages/requests/sessions.py", line 475, in request
resp = self.send(prep, **send_kwargs)
File "/home/username/.virtualenvs/venv/local/lib/python2.7/site-packages/requests/sessions.py", line 585, in send
r = adapter.send(request, **kwargs)
File "/home/username/.virtualenvs/venv/local/lib/python2.7/site-packages/requests/adapters.py", line 370, in send
conn = self.get_connection(request.url, proxies)
File "/home/username/.virtualenvs/venv/local/lib/python2.7/site-packages/requests/adapters.py", line 273, in get_connection
proxy_manager = …
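requests only understands socks5:// proxy URLs when its SOCKS extra is installed (pip install "requests[socks]", which pulls in PySocks); without it, building the proxy manager fails roughly where this traceback stops. A sketch assuming the ssh -D tunnel from above is listening on 127.0.0.1:8080:

```python
# Requires the SOCKS extra: pip install "requests[socks]"  (adds PySocks)
import requests

proxies = {
    "http": "socks5://127.0.0.1:8080",
    "https": "socks5://127.0.0.1:8080",
}
# r = requests.get("https://api.example.com", proxies=proxies)
```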
I found several web pages about this problem, but none of them solved my issue.
Even when I just do:
pip show
I get:
/usr/local/lib/python2.7/dist-packages/requests/__init__.py:80: RequestsDependencyWarning: urllib3 (1.9.1) or chardet (2.3.0) doesn't match a supported version!
RequestsDependencyWarning)
Traceback (most recent call last):
File "/usr/bin/pip", line 9, in <module>
load_entry_point('pip==1.5.6', 'console_scripts', 'pip')()
File "/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 480, in load_entry_point
return get_distribution(dist).load_entry_point(group, name)
File "/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 2691, in load_entry_point
return ep.load()
File "/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 2322, in load
return self.resolve()
File "/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 2328, in resolve
module = __import__(self.module_name, fromlist=['__name__'], level=0)
File "/usr/lib/python2.7/dist-packages/pip/__init__.py", line 74, in <module>
from pip.vcs import git, mercurial, subversion, …
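The RequestsDependencyWarning fires at import time when the installed urllib3/chardet versions fall outside the range that this requests release supports; checking the actual installed versions is the first diagnostic step, and reinstalling a matching pair (e.g. pip install -U requests urllib3) usually clears it. A small sketch:

```python
import requests
import urllib3

# requests verifies these dependency versions on import and warns on a
# mismatch, as seen in the traceback above.
print("requests", requests.__version__)
print("urllib3", urllib3.__version__)
```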
urllib3 ×10
python ×8
pip ×2
python-2.7 ×2
chardet ×1
connection ×1
debian ×1
http ×1
openai-api ×1
pool ×1
pyopenssl ×1
python-2.6 ×1
pyvmomi ×1
socks ×1
ssl ×1
web-scraping ×1