
# How to use lag function to calculate next observation in SAS

By : Vidhun Kumar
Date : November 19 2020, 12:41 AM
The problem with using lag is that when you use lag1(Cal) you're not getting the last value of Cal that was written to the output dataset; you're getting the last value that was passed to the lag1 function. It would probably be easier to use a retain, as follows:
code :
``````data want(drop=Cal_l:);
    set have;
    retain Cal_l1 Cal_l2;

    if missing(Cal) then Cal = Cal_l1 * &coef1 + Cal_l2 * &coef2;

    Cal_l2 = Cal_l1;
    Cal_l1 = Cal;
run;
``````
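The retain-based carry-forward above can be sketched in Python (a hypothetical stand-in for the data step, with made-up coefficients in place of the &coef1/&coef2 macro variables): two retained lag values are shifted after every row, so a missing value is filled from the previously *computed* values rather than from the raw input, which is exactly what lag1() would get wrong.

```python
# Minimal sketch of the retain/shift pattern; coef1 and coef2 are
# hypothetical placeholders for the &coef1/&coef2 macro values.
def fill_missing(values, coef1=0.5, coef2=0.5):
    out = []
    l1 = l2 = 0.0                      # retained lags, like Cal_l1/Cal_l2
    for v in values:
        if v is None:                  # missing(Cal)
            v = l1 * coef1 + l2 * coef2
        out.append(v)
        l2, l1 = l1, v                 # shift the retained lags
    return out

print(fill_missing([10.0, 20.0, None, None]))
# [10.0, 20.0, 15.0, 17.5] -- the second fill uses the first *filled* value
```

Note how the fourth value (17.5) is computed from 15.0, the value just written, which is the behavior the retain achieves and a plain lag1() would not.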


## How to calculate value for an observation by group?

By : user2841530
Date : March 29 2020, 07:55 AM
I have a data frame like so. How about:
code :
``````mydf$score - tapply(mydf$score, mydf$group, mean)[as.character(mydf$group)]
``````
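The tapply line works by computing one mean per group and then indexing those means by each row's group. A pure-Python sketch of the same idea (data made up for illustration) makes the two steps explicit:

```python
# Group-wise mean-centering: compute each group's mean once, then
# subtract the group mean from every score in that group.
def center_by_group(groups, scores):
    sums, counts = {}, {}
    for g, s in zip(groups, scores):
        sums[g] = sums.get(g, 0.0) + s
        counts[g] = counts.get(g, 0) + 1
    means = {g: sums[g] / counts[g] for g in sums}
    return [s - means[g] for g, s in zip(groups, scores)]

print(center_by_group(["a", "a", "b", "b"], [1.0, 3.0, 10.0, 20.0]))
# [-1.0, 1.0, -5.0, 5.0]
```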

## Calculate average excluding current observation

By : Cesar Velica
Date : March 29 2020, 07:55 AM
Although you could switch to SUMPRODUCT so as to be able to create an explicit reference to the row in which the formula resides (not possible within SUMIF), I'd probably prefer the slightly longer:
code :
``````=(SUMIFS(A:A,D:D,D3,E:E,E3,AP:AP,AP3)-A3)/(COUNTIFS(D:D,D3,E:E,E3,AP:AP,AP3)-1)
``````
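The (sum - self) / (count - 1) trick in the formula generalizes beyond Excel. Here is a minimal Python sketch (hypothetical data, single key instead of the three criteria columns) of the same leave-one-out average over rows sharing a key:

```python
# Leave-one-out average: for each row, average the other rows with the
# same key, computed as (group sum - own value) / (group count - 1).
def loo_average(values, keys):
    sums, counts = {}, {}
    for v, k in zip(values, keys):
        sums[k] = sums.get(k, 0.0) + v
        counts[k] = counts.get(k, 0) + 1
    return [(sums[k] - v) / (counts[k] - 1) if counts[k] > 1 else None
            for v, k in zip(values, keys)]

print(loo_average([1.0, 2.0, 6.0], ["x", "x", "y"]))
# [2.0, 1.0, None] -- a lone row has no "other" observations
```

Returning None for singleton groups mirrors the #DIV/0! the spreadsheet formula would produce when COUNTIFS returns 1.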

## Calculate difference between observation in dependence of value of another variable

By : user2322520
Date : March 29 2020, 07:55 AM
I have an app record and want to calculate the time between two specific events. You were very close; I think you need something like:
code :
``````library(dplyr)

appdata %>%
  group_by(userid, dayid) %>%
  summarise(usagetime_in_sec = sum(datesec[activity == "appclose"] -
                                   datesec[activity == "appstart"]))

#   userid dayid usagetime_in_sec
#    <dbl> <dbl>            <dbl>
#1      1    32             1459
``````
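The pairing logic above can be sketched in plain Python (the rows below are made up, chosen to reproduce the 1459-second total shown): summing the "appclose" timestamps and subtracting the "appstart" timestamps within each (userid, dayid) group gives the total usage time whenever start/close events alternate, since sum(close) - sum(start) equals the sum of the individual close-minus-start pairs.

```python
# Per-group usage time: add close timestamps, subtract start timestamps.
def usage_time(rows):
    totals = {}
    for userid, dayid, activity, datesec in rows:
        key = (userid, dayid)
        sign = 1 if activity == "appclose" else -1
        totals[key] = totals.get(key, 0) + sign * datesec
    return totals

rows = [(1, 32, "appstart", 100), (1, 32, "appclose", 900),
        (1, 32, "appstart", 1000), (1, 32, "appclose", 1659)]
print(usage_time(rows))
# {(1, 32): 1459} -- (900 - 100) + (1659 - 1000)
```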

## Calculate for each row for how many columns is observation within top X%

By : user2513548
Date : March 29 2020, 07:55 AM
Here's a solution; maybe not the most elegant or the most optimal, but it works:
code :
``````import numpy as np
# df is the input DataFrame: an ID column followed by the value columns

# For each value column, indicate the outliers
for col in df.columns[1:]:
    df[f'{col}_outliers_pos'] = np.where(df[col] >= df[col].quantile(0.95), 1, 0)
    df[f'{col}_outliers_neg'] = np.where(df[col] <= df[col].quantile(0.05), 1, 0)

# Create lists for positive and negative columns
pos_cols = [col for col in df.columns if 'pos' in col]
neg_cols = [col for col in df.columns if 'neg' in col]

# Calculate the sum of both negative and positive
df['sum_out_positive'] = df[pos_cols].sum(axis=1)
df['sum_out_negative'] = df[neg_cols].sum(axis=1)

# Drop columns we don't need to get correct output
df.drop(pos_cols + neg_cols, axis=1, inplace=True)

print(df)
#   ID  v1  v2  v3  sum_out_positive  sum_out_negative
# 0  a   1   2   0                 0                 2
# 1  b   2   3   0                 0                 1
# 2  c   1   6   1                 1                 1
# 3  d   3   1   2                 0                 0
# 4  e   4   0   3                 0                 1
# 5  f   5   2   5                 2                 0
``````
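If the temporary *_outliers_* helper columns are unwanted, the same per-row counts can be computed directly. Here is a pure-Python sketch that computes the 5th/95th percentile bounds once per column and then counts exceedances per row; it uses statistics.quantiles with method="inclusive", which matches the linear interpolation that pandas' quantile() uses by default.

```python
from statistics import quantiles

# columns: dict of column name -> list of values (all the same length)
def outlier_counts(columns):
    bounds = {}
    for name, vals in columns.items():
        cuts = quantiles(vals, n=20, method="inclusive")  # 5%..95% cut points
        bounds[name] = (cuts[0], cuts[-1])                # (q05, q95)
    n = len(next(iter(columns.values())))
    pos = [sum(columns[c][i] >= bounds[c][1] for c in columns) for i in range(n)]
    neg = [sum(columns[c][i] <= bounds[c][0] for c in columns) for i in range(n)]
    return pos, neg

print(outlier_counts({"v": [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]}))
```

Computing the quantile bounds once per column, instead of once per row, is also what makes the pandas version above reasonably fast.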

## Calculate Log Difference for Each Day in R, Producing NA for the First Observation of Each Day

By : Arindam
Date : March 29 2020, 07:55 AM
Problem: calculate the difference in log for each day (grouped by day); the ideal result should produce NA for the first observation of each day. Never use $ inside dplyr pipes; also, you need to prepend NA to the diff output:
code :
``````library(dplyr)

df %>%
  mutate(day = lubridate::day(t)) %>%
  group_by(day) %>%
  mutate(logdif = c(NA, diff(log(v))))

#   t                 v   day     logdif
#  <chr>            <dbl> <int>     <dbl>
#1 2019-10-01 09:30  105.     1       NA
#2 2019-10-01 09:35  105.     1 -0.00110
#3 2019-10-01 09:40  105.     1 -0.000562
#4 2019-10-02 09:30  105.     2       NA
#5 2019-10-02 09:35  104.     2 -0.00110
#6 2019-10-02 09:40  105.     2  0.00137
#7 2019-10-03 09:30  105.     3       NA
#8 2019-10-03 09:35  104.     3 -0.00820
#9 2019-10-03 09:40  104.     3 -0.00110
``````
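The same day-wise log difference can be sketched in plain Python (hypothetical data), making the NA-for-the-first-observation rule explicit: whenever the day changes, emit None; otherwise emit log(v) minus log of the previous value within the same day.

```python
from math import log

# logdif[i] = log(v[i]) - log(v[i-1]) within a day, None at each day start
def log_diff_by_day(days, values):
    out, prev_day, prev_v = [], None, None
    for d, v in zip(days, values):
        if d != prev_day:
            out.append(None)          # first observation of the day
        else:
            out.append(log(v) - log(prev_v))
        prev_day, prev_v = d, v
    return out

print(log_diff_by_day([1, 1, 2, 2], [100.0, 110.0, 100.0, 90.0]))
```

This assumes the rows are already sorted by day and time, which the group_by/mutate version does not require.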