Here is my code:
x = 1.0
y = 100000.0
print x/y
My quotient displays as 1.00000e-05. Is there a way to suppress the scientific notation and have it display as 0.00001? I will be using the result as a string.
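Not part of the original post: a minimal sketch of one way to get the fixed-point string (the five decimal places are an assumption taken from the desired output 0.00001):

x = 1.0
y = 100000.0

# Fixed-point formatting keeps the quotient out of scientific notation.
quotient_str = "%.5f" % (x / y)   # works in Python 2 and 3
print(quotient_str)               # prints 0.00001

# Equivalent with str.format (Python 2.7+ / 3):
print("{0:.5f}".format(x / y))    # prints 0.00001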
Day_Date,timeofday_desc,Timeofday_hour,Timeofday_minute,Timeofday_second,value
2017-12-18,12:21:02 AM,0,21,2,“1.779209040E+08”
2017-12-19,12:21:02 AM,0,21,2,“1.779209040E+08”
2017-12-20,12:30:52 AM,0,30,52,“1.779209040E+08”
2017-12-21,12:30:52 AM,0,30,52,“1.779209040E+08”
2017-12-22,12:47:10 AM,0,47,10,“1.779209040E+08”
2017-12-23,12:47:10 AM,0,47,10,“1.779209040E+08”
2017-12-24,02:46:59 AM,2,46,59,“1.779209040E+08”
2017-12-25,02:46:59 AM,2,46,59,“1.779209040E+08”
2017-12-26,03:10:27 AM,3,10,27,“1.779209040E+08”
2017-12-27,03:10:27 AM,3,10,27,“1.779209040E+08”
2017-12-28,03:52:08 AM,3,52,8,“1.779209040E+08”
I am trying to convert the value column to 177920904:
val df1 = df.withColumn("s", 'value.cast("Decimal(10,4)")).drop("value").withColumnRenamed("s", "value")
I also tried casting the value to Float and Double; the output is always null.
df1.select("value").show()
+-----------+
| value |
+-----------+
| null|
| null|
| null|
| null|
| null|
| null|
| null|
| null|
df.printSchema
root
|-- Day_Date: string (nullable = true)
|-- timeofday_desc: string (nullable = true)
|-- Timeofday_hour: string (nullable …
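Two things stand out from the sample rows and the schema, neither stated in the original post: the value field still carries the surrounding quote characters (shown above as curly quotes), which can make a direct cast come back null, and Decimal(10,4) only leaves room for six integer digits, so 177920904 would overflow (Spark returns null on decimal overflow) even without the quotes. A hedged pyspark sketch of the same idea as the Scala snippet above (the file path is hypothetical):

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# "data.csv" is a hypothetical path standing in for the file shown above.
df = spark.read.option("header", True).csv("data.csv")

# Strip any straight or curly quotes left in the field, then cast to a decimal
# wide enough for 177920904.0000 (9 integer digits + 4 fractional digits).
df1 = df.withColumn(
    "value",
    F.regexp_replace(F.col("value"), '["“”]', "").cast("decimal(18,4)"),
)
df1.select("value").show()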
I have a Spark aggregation whose results I want to write to CSV, but Spark always writes the small decimal values in scientific notation. I have already tried the solution mentioned in this question, but it did not work either.

Expected output:
foo,avg(bar)
a,0.0000002
b,0.0000001
Actual output:
foo,avg(bar)
a,2.0E-7
b,1.0E-7
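Not from the original post: one hedged way to keep the CSV free of E-notation is to turn the averaged column into a plain-formatted string before writing. The sketch below rebuilds a stand-in aggregation (the column names foo/bar, the alias avg_bar, and the output path are assumptions) and uses format_string, which applies printf-style %f formatting:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Stand-in for the real aggregation; only the shape matters here.
df = spark.createDataFrame([("a", 2.0e-7), ("b", 1.0e-7)], ["foo", "bar"])
agg = df.groupBy("foo").agg(F.avg("bar").alias("avg_bar"))

# %f never uses scientific notation, so 2.0E-7 is written as "0.0000002".
out = agg.withColumn("avg_bar", F.format_string("%.7f", F.col("avg_bar")))
out.write.option("header", True).mode("overwrite").csv("/tmp/plain_avgs")  # hypothetical path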