I'm trying to add users to my channel by username. I'm using Python 3.6 with the Telethon library on a PythonAnywhere server:
import time

from telethon import TelegramClient
from telethon.tl.functions.channels import InviteToChannelRequest, JoinChannelRequest
from telethon.tl.functions.contacts import ResolveUsernameRequest
from telethon.tl.types import InputPeerChannel, InputPeerUser, InputUser

api_hash = 'b7**'
phone = '+7***'
client = TelegramClient('new_ses', api_id, api_hash)
client.connect()

chann = client.get_entity('channelname')  # it's a public channel
print(chann.id)           # 1161823752
print(chann.access_hash)  # 8062085565372622341
time.sleep(30)
chan = InputPeerChannel(chann.id, chann.access_hash)

user = client(ResolveUsernameRequest('Chai***'))
print(user.users[0].id)           # 193568760
print(user.users[0].access_hash)  # -4514649540347033311
time.sleep(1 * 30)
user = InputUser(user.users[0].id, user.users[0].access_hash)

client.invoke(InviteToChannelRequest(chan, [user]))
This works for a while, but then I get: telethon.errors.rpc_error_list.PeerFloodError: (PeerFloodError(...), 'Too many requests')
What exactly am I doing wrong, and how can I avoid it? The following code also worked for me, but after adding about 20 users I hit the same error:
from telethon.helpers import get_input_peer

client.invoke(InviteToChannelRequest(
    get_input_peer(client.get_entity(chan)),
    [get_input_peer(client.get_entity(user))]
))
Please help: how can I add around 200 users by username without getting banned? Is there perhaps another way to do this from Python, with another library or through the API directly?
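PeerFloodError is enforced server-side, so no client code can avoid it entirely; once it appears, the account is flagged and only waiting helps. What does reduce the risk is pacing invites with long random pauses and stopping at the first flood error. A minimal sketch of that pattern, kept library-agnostic: `invite_one` and the pause lengths are assumptions (Telegram does not document its limits), and with Telethon `invite_one` would wrap `InviteToChannelRequest` while the exception to catch would be `PeerFloodError` rather than the `RuntimeError` stand-in used here:

```python
import random
import time

def invite_slowly(invite_one, usernames, min_pause=60, max_pause=180):
    """Call invite_one(username) for each username, pausing a random
    min_pause..max_pause seconds between calls, and stop at the first
    flood-type failure instead of retrying against the limit."""
    added = []
    for name in usernames:
        try:
            invite_one(name)      # e.g. wrap InviteToChannelRequest here
        except RuntimeError:      # stand-in for telethon's PeerFloodError
            break                 # continuing only lengthens the restriction
        added.append(name)
        time.sleep(random.uniform(min_pause, max_pause))
    return added
```

Even paced like this, adding 200 users from one fresh account may still trip the limit; spreading the work over days or several accounts is the usual workaround.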
I have a Python script that runs a gbq job to load a CSV file f into a table in BigQuery. I try to upload the data in CSV format and get the following error:
400 Invalid schema update. Cannot add fields (field: string_field_8)
Here is my CSV:
id,first_name,username,last_name,chat_username,chat_id,forward_date,message_text
231125223|Just|koso|swissborg_bounty|-1001368946079|1517903147|tes
481895079|Emerson|EmersonEmory|swissborg_bounty|-1001368946079|1517904387|pictu
316560356|Ken Sam|ICOnomix|swissborg_bounty|-1001368946079|1517904515|Today
Here is my code:
import os

os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '***.json'
os.environ['GOOGLE_CLOUD_DISABLE_GRPC'] = 'True'

from google.cloud import bigquery

dataset_name = 'test_temporary_dataset'
table_name = 'table_telega'
bigquery_client = bigquery.Client()
dataset = bigquery_client.dataset(dataset_name)
table = dataset.table(table_name)

job_config = bigquery.LoadJobConfig()
job_config.source_format = bigquery.SourceFormat.CSV
job_config.skip_leading_rows = 1
job_config.autodetect = True
job_config.field_delimiter = '|'
job_config.allow_jagged_rows = True
job_config.ignore_unknown_values = True
job_config.allow_quoted_newlines = True
with open('**.csv', 'rb') as source_file:
    …

I have a Python script that runs a gbq job to load a CSV file f into a table in BigQuery. All of the data ends up in a single column, but I want it loaded into the individual columns. I tried autodetect, but it did not help either. My CSV:
id,first_name,username,last_name,chat_username,chat_id,forward_date,message_text
231125223~Just~koso~swissborg_bounty~-1001368946079~1517903147~test
481895079~Emerson~EmersonEmory~swissborg_bounty~-1001368946079~1517904387~picture
316560356~Ken Sam~ICOnomix~swissborg_bounty~-1001368946079~1517904515~Today
Here is my code:
import os

os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '***.json'
os.environ['GOOGLE_CLOUD_DISABLE_GRPC'] = 'True'

from google.cloud import bigquery

dataset_name = 'test_temporary_dataset'
table_name = 'table_telega'
bigquery_client = bigquery.Client()
dataset = bigquery_client.dataset(dataset_name)
table = dataset.table(table_name)
#table.reload()

job_config = bigquery.LoadJobConfig()
job_config.source_format = bigquery.SourceFormat.CSV
job_config.skip_leading_rows = 1
job_config.autodetect = True
job_config.allow_jagged_rows = True
job_config.allow_quoted_newlines = True
job_config.field_delimiter = '~'

with open('tele2.csv', 'rb') as source_file:
    #job = table.upload_from_file(source_file, source_format='text/csv')
    job = bigquery_client.load_table_from_file(source_file, table, job_config=job_config)

job.result()
How do I load the CSV correctly, column by column?
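One thing worth checking before submitting the load job: in both CSV samples above, the header row is comma-separated while the data rows use `|` or `~`, so with any single delimiter setting BigQuery sees a different field count per row, which is consistent with both the "Cannot add fields" error and the everything-in-one-column result. A quick local sanity check using only the standard library (`check_delimiter` is a hypothetical helper, not part of the BigQuery client) can confirm whether every data row splits into the header's field count:

```python
import csv

def check_delimiter(text, delimiter):
    """Return (line_no, field_count) for every data row whose field
    count differs from the comma-separated header's field count."""
    lines = text.splitlines()
    expected = len(next(csv.reader([lines[0]])))   # header uses commas
    bad = []
    for i, row in enumerate(csv.reader(lines[1:], delimiter=delimiter), start=2):
        if len(row) != expected:
            bad.append((i, len(row)))
    return bad

sample = (
    "id,first_name,username,last_name,chat_username,chat_id,forward_date,message_text\n"
    "231125223~Just~koso~swissborg_bounty~-1001368946079~1517903147~test"
)
print(check_delimiter(sample, '~'))  # → [(2, 7)]: 7 fields vs 8 in the header
```

A non-empty result means the file itself is inconsistent: rewriting the header with the same delimiter as the data rows (or the rows with commas) before uploading should let autodetect infer one column per field.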