Tokenize a string, keeping the delimiters, in Python

for*_*ran 17 python string split tokenize

Is there an equivalent of str.split in Python that also returns the delimiters?

I need to preserve the whitespace layout of the output after processing some of the tokens.

Example:

>>> s="\tthis is an  example"
>>> print s.split()
['this', 'is', 'an', 'example']

>>> print what_I_want(s)
['\t', 'this', ' ', 'is', ' ', 'an', '  ', 'example']

Thanks!

Jon*_*erg 19

How about:

import re
# every match is either a run of whitespace or a run of non-whitespace,
# so the delimiters are kept alongside the tokens
splitter = re.compile(r'(\s+|\S+)')
splitter.findall(s)
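On the example string from the question, this gives exactly the layout-preserving split that was asked for (a quick check, continuing from the snippet above):

>>> s = "\tthis is an  example"
>>> splitter.findall(s)
['\t', 'this', ' ', 'is', ' ', 'an', '  ', 'example']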


Den*_*ach 6

>>> import re
>>> re.compile(r'(\s+)').split("\tthis is an  example")
['', '\t', 'this', ' ', 'is', ' ', 'an', '  ', 'example']
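Note the leading empty string: re.split emits it because the input begins with a delimiter. If that entry is unwanted, one possible sketch is to filter out empty strings afterwards (this filtering step is an addition, not part of the answer above):

>>> # drop empty strings produced when the input starts or ends with a delimiter
>>> [t for t in re.compile(r'(\s+)').split("\tthis is an  example") if t]
['\t', 'this', ' ', 'is', ' ', 'an', '  ', 'example']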