Recursive calculation on DataFrame using .NET for Spark

Posted: 2021-04-02 09:27:34

Question:

I want to calculate the RSI (Relative Strength Index) using .NET for Spark.

The formula for RSI is:

RSI = 100 - 100 / (1 + RS)

RS = Average Gain / Average Loss

The first Average Gain and Average Loss are simple 14-period averages:

First Average Gain = Sum of Gains over the past 14 periods / 14
First Average Loss = Sum of Losses over the past 14 periods / 14

All subsequent calculations are based on the previous averages and the current gain or loss:

Average Gain = [(previous Average Gain) x 13 + current Gain] / 14.

Average Loss = [(previous Average Loss) x 13 + current Loss] / 14.
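This smoothing is Wilder's recursion (an exponential moving average with alpha = 1/14). A minimal sketch of how the average gain evolves, written in plain Python for illustration only (not Spark code; the function name and toy series are made up):

```python
def wilder_average_gains(gains, period=14):
    """Wilder smoothing of a gain series.

    The first average is a plain mean of the first `period` gains;
    every later average blends the previous average with the current gain.
    """
    if len(gains) < period:
        raise ValueError("need at least `period` gains")
    averages = [sum(gains[:period]) / period]  # First Average Gain
    for gain in gains[period:]:
        # Average Gain = (previous Average Gain * 13 + current Gain) / 14
        averages.append((averages[-1] * (period - 1) + gain) / period)
    return averages

# Toy series: fourteen gains of 1.0, then one gain of 15.0.
avgs = wilder_average_gains([1.0] * 14 + [15.0])
# First average is 1.0; the next is (1.0 * 13 + 15.0) / 14 = 2.0
```

The key point for Spark is that each output depends on the previous output, not just on a fixed window of inputs, which is why a plain window function cannot express it directly.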

The data is in the DataFrame rsiCalcPos5 and looks like this:

+--------------------+-----+------+----+-----+------------------+-------------------+----------+------------------+--------------------+-------------------+-------------------+------------------+
|      TimeSeriesType|Year0|Month0|Day0|Hour0|        avg(Value)|          Timestamp|  UnixTime|         nextValue|          deltaValue|               gain|               loss|             gain1|
+--------------------+-----+------+----+-----+------------------+-------------------+----------+------------------+--------------------+-------------------+-------------------+------------------+
|Current Available...| 2021|     3|   3|    9| 219.8235294117647|2021-03-03 09:00:00|1614758400|218.59733449857987| -1.2261949131848269|                0.0| 1.2261949131848269|               0.0|
|Current Available...| 2021|     3|   3|   10|218.59733449857987|2021-03-03 10:00:00|1614762000|185.59442632671212| -33.002908171867745|                0.0| 33.002908171867745|               0.0|
|Current Available...| 2021|     3|   3|   11|185.59442632671212|2021-03-03 11:00:00|1614765600| 190.5523781944545|   4.957951867742366|  4.957951867742366|                0.0|1.6526506225807889|
|Current Available...| 2021|     3|   3|   12| 190.5523781944545|2021-03-03 12:00:00|1614769200|187.88173813444055| -2.6706400600139375|                0.0| 2.6706400600139375|1.2394879669355916|
|Current Available...| 2021|     3|   3|   13|187.88173813444055|2021-03-03 13:00:00|1614772800| 187.6245558053521|-0.25718232908846517|                0.0|0.25718232908846517|0.9915903735484732|
|Current Available...| 2021|     3|   3|   14| 187.6245558053521|2021-03-03 14:00:00|1614776400|186.56644553819817| -1.0581102671539213|                0.0| 1.0581102671539213|0.8263253112903944|
|Current Available...| 2021|     3|   3|   15|186.56644553819817|2021-03-03 15:00:00|1614780000|186.66761484852796| 0.10116931032979437|0.10116931032979437|                0.0|0.7227315968674516|
|Current Available...| 2021|     3|   3|   16|186.66761484852796|2021-03-03 16:00:00|1614783600|165.79466929911155| -20.872945549416414|                0.0| 20.872945549416414|0.6323901472590201|
|Current Available...| 2021|     3|   3|   17|165.79466929911155|2021-03-03 17:00:00|1614787200|178.60478239401849|  12.810113094906939| 12.810113094906939|                0.0|1.9854704747754555|
|Current Available...| 2021|     3|   3|   18|178.60478239401849|2021-03-03 18:00:00|1614790800| 215.3916108565386|  36.786828462520106| 36.786828462520106|                0.0| 5.465606273549921|
|Current Available...| 2021|     3|   3|   19| 215.3916108565386|2021-03-03 19:00:00|1614794400|221.27369459516595|   5.882083738627358|  5.882083738627358|                0.0| 5.503467861284233|
|Current Available...| 2021|     3|   3|   20|221.27369459516595|2021-03-03 20:00:00|1614798000|231.88854705635575|  10.614852461189798| 10.614852461189798|                0.0|  5.92941657794303|
|Current Available...| 2021|     3|   3|   21|231.88854705635575|2021-03-03 21:00:00|1614801600|238.82354991634134|  6.9350028599855875| 6.9350028599855875|                0.0| 6.006769368869381|
|Current Available...| 2021|     3|   3|   22|238.82354991634134|2021-03-03 22:00:00|1614805200|240.02948909258865|  1.2059391762473126| 1.2059391762473126|                0.0| 5.663852926539233|
|Current Available...| 2021|     3|   3|   23|240.02948909258865|2021-03-03 23:00:00|1614808800|240.92351533915001|  0.8940262465613671| 0.8940262465613671|                0.0|              null|
|Current Available...| 2021|     3|   4|    0|240.92351533915001|2021-03-04 00:00:00|1614812400|239.63160854893138| -1.2919067902186328|                0.0| 1.2919067902186328|              null|
|Current Available...| 2021|     3|   4|    1|239.63160854893138|2021-03-04 01:00:00|1614816000|240.48959521094642|  0.8579866620150369| 0.8579866620150369|                0.0|              null|
|Current Available...| 2021|     3|   4|    2|240.48959521094642|2021-03-04 02:00:00|1614819600|192.37784787942516|  -48.11174733152126|                0.0|  48.11174733152126|              null|
|Current Available...| 2021|     3|   4|    3|192.37784787942516|2021-03-04 03:00:00|1614823200|192.96993537510536|  0.5920874956802038| 0.5920874956802038|                0.0|              null|
|Current Available...| 2021|     3|   4|    4|192.96993537510536|2021-03-04 04:00:00|1614826800|193.60104726861024|  0.6311118935048796| 0.6311118935048796|                0.0|              null|
+--------------------+-----+------+----+-----+------------------+-------------------+----------+------------------+--------------------+-------------------+-------------------+------------------+

I have already calculated gain and loss, as well as the first average gain (gain1 = 5.663852926539233, since the interval for calculating RSI is 14).

Now I am stuck computing the remaining average gains, starting from row 15. The formula is recursive and I am not sure how to implement it. So far I have tried window functions, but did not get the correct result.

WindowSpec windowRSI3 = Microsoft.Spark.Sql.Expressions.Window
     .PartitionBy("TimeSeriesType")
     .OrderBy("Year0", "Month0", "Day0", "Hour0");
DataFrame rsiCalcPos6 = rsiCalcPos5.WithColumn("avgGainj", When(Col("gain1").IsNull(),
      (Lag(Col("gain1"), 1, 0).Multiply(13 / 14).Minus((Col("gain").Multiply(-1 / 14))
      .Over(windowRSI3)))).Otherwise(Col("gain1")));

Here I get an exception:

org.apache.spark.sql.AnalysisException: Expression '(gain#175 * cast(0 as double))' not supported within a window function.

The recursive formula I want to use needs to compute avgGainj one row at a time and use that result when computing the next avgGain(j+1).

Any suggestions would be appreciated. Thanks!


Answer 1:

I am not sure my formula is exactly right, but this is how I would approach it:

using System;
using System.Collections.Generic;
using Microsoft.Spark.Sql;
using Microsoft.Spark.Sql.Expressions;
using Microsoft.Spark.Sql.Types;

namespace RsiExample
{
    class Program
    {
        static void Main(string[] args)
        {
            var spark = SparkSession.Builder().GetOrCreate();

            var df = spark.CreateDataFrame(new List<GenericRow>()
            {
                new GenericRow(new object[] { 1, 0.0, 1.226 }),
                new GenericRow(new object[] { 2, 0.0, 33.09 }),
                new GenericRow(new object[] { 3, 3.3, 0.0 }),
                new GenericRow(new object[] { 4, 0.0, 2.67 }),
                new GenericRow(new object[] { 5, 0.0, 2.67 }),
                new GenericRow(new object[] { 6, 0.0, 2.67 }),
                new GenericRow(new object[] { 7, 7.7, 0.0 }),
                new GenericRow(new object[] { 8, 0.0, 2.67 }),
                new GenericRow(new object[] { 9, 9.9, 0.0 }),
                new GenericRow(new object[] { 10, 10.1, 0.0 }),
                new GenericRow(new object[] { 11, 11.11, 0.0 }),
                new GenericRow(new object[] { 12, 12.12, 0.0 }),
                new GenericRow(new object[] { 13, 13.13, 0.0 }),
                new GenericRow(new object[] { 14, 14.14, 0.0 }),
                new GenericRow(new object[] { 15, 15.15, 0.0 }),
                new GenericRow(new object[] { 16, 16.16, 0.0 }),
                new GenericRow(new object[] { 17, 17.17, 0.0 }),
                new GenericRow(new object[] { 18, 18.18, 0.0 }),
                new GenericRow(new object[] { 19, 19.19, 0.0 })
            }, new StructType(new List<StructField>()
            {
                new StructField("Row", new IntegerType()),
                new StructField("Gain", new DoubleType()),
                new StructField("Loss", new DoubleType())
            }));

            df.Show();

            // First use a window over the last 14 rows
            var lastFourteenRowsWindow = Window.OrderBy(Functions.Desc("Row")).RowsBetween(0, 14);

            // Save the sum of the last fourteen rows
            var lastFourteenGains = df.WithColumn("LastFourteenGains",
                Functions.Sum("Gain").Over(lastFourteenRowsWindow));

            // Calculate the average of those (there is also an Avg function you could use instead of Sum/14)
            var averageGain =
                lastFourteenGains.WithColumn("AverageGain", Functions.Col("LastFourteenGains") / 14);

            // Create a second window that doesn't have the 14-row frame
            var rowWindow = Window.OrderBy(Functions.Desc("Row"));

            // Use the new window to retrieve the previous gain
            var previousGains = averageGain.WithColumn("PreviousAverageGain",
                Functions.Lead("AverageGain", 1).Over(rowWindow));

            // Previous Gain / 13 + (Sum(Last 14 Gains) / 14)
            var result = previousGains.WithColumn("CurrentAverageGains",
                ((Functions.Col("PreviousAverageGain") / 13) + Functions.Col("AverageGain")) / 14);

            result.Show();
        }
    }
}
If you do a .Show() between each stage, you can verify that it is correct.


Comments:

Ed, thanks for your answer. The first average gain and loss should be taken over the first 14 rows, not the last 14. In any case, I ended up calculating the RSI in a for loop. I know that is far from efficient in Spark, but so far I have not found a better solution. Thanks for your help.
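The driver-side loop mentioned in the comment might look like the following, sketched in Python for illustration under the question's formulas (the function name is an assumption, and it presumes the gain/loss columns have been collected to the driver; a .NET version would be analogous):

```python
def rsi_series(gains, losses, period=14):
    """Compute RSI values one row at a time using Wilder smoothing.

    `gains` and `losses` are per-period non-negative values, as in the
    question's gain/loss columns. Returns one RSI per row after the
    initial `period` rows used to seed the averages.
    """
    if len(gains) != len(losses) or len(gains) < period:
        raise ValueError("need matching series of at least `period` rows")
    # First averages: plain means of the first `period` rows.
    avg_gain = sum(gains[:period]) / period
    avg_loss = sum(losses[:period]) / period
    rsis = []
    for g, l in zip(gains[period:], losses[period:]):
        # Wilder recursion: blend previous average with current value.
        avg_gain = (avg_gain * (period - 1) + g) / period
        avg_loss = (avg_loss * (period - 1) + l) / period
        if avg_loss == 0:
            rsis.append(100.0)  # no losses at all -> RSI pegged at 100
        else:
            rs = avg_gain / avg_loss
            rsis.append(100 - 100 / (1 + rs))
    return rsis

# Equal average gain and loss -> RS = 1 -> RSI = 50.
print(rsi_series([1.0] * 15, [1.0] * 15))
```

Because each iteration consumes the averages produced by the previous one, this sequential loop matches the recurrence exactly, at the cost of running on a single node.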
