mirror of
https://github.com/valitydev/salt.git
synced 2024-11-08 17:33:54 +00:00
salt-ssh: fix JSON load of return data when it contains non-ascii
For reasons I can't explain, in `salt.utils.json.find_json()` using `.splitlines()` will sometimes convert a unicode string into a list of str types on Python 2. So, that's weird. This can be triggered in salt-ssh whenever there are non-ascii chars in the return data.

[DEBUG] raw = u'{"local": {"return": {"foo": "\xf6\xe4\xfc"}}}'
[DEBUG] raw.splitlines() = ['{"local": {"return": {"foo": "\xc3\xb6\xc3\xa4\xc3\xbc"}}}']

To resolve this, the UnicodeDecodeError is caught and the join is reattempted with a decoded list.

Additionally, this fixes a performance oversight. We process the string one line at a time, but we were iterating a number of times equal to the length of the string. This means that we nearly always ended up doing a bunch of extra list slices resulting in empty lists, which, when joined and loaded, produce ValueErrors, which we were catching and ignoring. By enumerating over the split string, we ensure that we iterate at most a number of times equal to the number of lines in the string.
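The line-wise scan described above can be sketched as follows. This is a simplified, Python 3 illustration of the fixed approach (split once, then trim leading garbage one line per iteration), not Salt's actual implementation; the function name and error message here are made up, and the Python 2 decode fallback is omitted since Python 3 strings are already unicode.

```python
import json

def find_json_sketch(raw):
    """Return the first trailing chunk of ``raw`` that parses as JSON.

    Sketch of the patched loop: split the string into lines once, then
    on each iteration drop one more leading line of garbage and try to
    load the remainder. Iterations are bounded by the number of lines,
    not the number of characters.
    """
    lines = raw.splitlines()
    for ind in range(len(lines)):
        working = '\n'.join(lines[ind:])
        try:
            return json.loads(working)
        except ValueError:
            # Leading garbage still present; strip another line and retry.
            continue
    raise ValueError('No JSON data could be decoded')
```

For example, `find_json_sketch('some log noise\n{"local": {"return": {"foo": "öäü"}}}')` skips the noise line and returns the parsed dict, non-ascii values included.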
parent 158d8028d5
commit 096bcb3ca9
@@ -25,8 +25,13 @@ def find_json(raw):
     string to start with garbage and end with json but be cleanly loaded
     '''
     ret = {}
-    for ind, _ in enumerate(raw):
-        working = '\n'.join(raw.splitlines()[ind:])
+    lines = raw.splitlines()
+    for ind, _ in enumerate(lines):
+        try:
+            working = '\n'.join(lines[ind:])
+        except UnicodeDecodeError:
+            working = '\n'.join(salt.utils.data.decode(lines[ind:]))
+
         try:
             ret = json.loads(working)  # future lint: blacklisted-function
         except ValueError: