Commit d418206

completed all documentation. completed check(). remaining task is to validate non-standard calendar and move to fpa

1 parent 096b6aa

File tree

16 files changed: +158 additions, -411 deletions


554.R

Lines changed: 0 additions & 128 deletions
This file was deleted.

NAMESPACE

Lines changed: 0 additions & 1 deletion
@@ -21,7 +21,6 @@ export(qoq)
 export(qoqtd)
 export(qtd)
 export(qtdopq)
-export(seq_date_sql)
 export(wow)
 export(wowtd)
 export(wtd)

R/methods.R

Lines changed: 18 additions & 22 deletions
@@ -106,6 +106,22 @@ S7::method(create_calendar,ti) <- function(x){
 #' @title Execute time-intelligence or segments class objects to return the underlying transformed table
 #' @name calculate
 #' @param x ti object
+#' @description
+#' The `calculate()` function takes an object created by a time function (like `ytd()`, `mtd()`, or `qtd()`) or a segment function (like `cohort()` or `abc()`) and executes the underlying transformation logic.
+#' It translates the function blueprint into an actionable query, returning the final data table.
+#'
+#' @details
+#' The TI and segment functions in **fpaR**, such as `ytd()` and `cohort()`, are designed to be **lazy and database-friendly**.
+#' They do not perform the heavy data transformation immediately.
+#' Instead, they return a blueprint object (of class `ti`, `segment_abc` or `segment_cohort`) that contains all the parameters and logic needed for the calculation.
+#'
+#' **`calculate()`** serves as the **execution engine**.
+#'
+#' When called, it interprets the blueprint and generates optimized R code or SQL code (using the `dbplyr` package) that is then executed efficiently on the data source, whether it's an in-memory `tibble` or a remote database backend (like `duckdb` or `snowflake`).
+#' This approach minimizes data transfer and improves performance for large datasets.
+#'
+#' The resulting table will be sorted by the relevant date column to ensure the correct temporal ordering of the calculated metrics.
+#'
 #'
 #' @returns dbi object
 #' @export

@@ -125,19 +141,9 @@ S7::method(calculate,ti) <- function(x){
 }
 
 
-#' @title Execute time-intelligence or segments class objects to return the underlying transformed table
+#' @rdname calculate
 #' @name calculate
-#' @param x segment object
-#'
-#' @returns dbi object
 #' @export
-#' @examples
-#' \dontrun{
-#' sales |>
-#'   group_by(store_key) |>
-#'   abc(category_values = c(.3,.5,.75,.85)) |>
-#'   calculate()
-#'}
 S7::method(calculate,segment_cohort) <- function(x){
 
   out <- x@fn@fn_exec(x)

@@ -148,19 +154,9 @@ S7::method(calculate,segment_cohort) <- function(x){
 
 
-#' @title Execute time-intelligence or segments class objects to return the underlying transformed table
+#' @rdname calculate
 #' @name calculate
-#' @param x segment object
-#'
-#' @returns dbi object
 #' @export
-#' @examples
-#' \dontrun{
-#' sales |>
-#'   group_by(store_key) |>
-#'   abc(category_values = c(.3,.5,.75,.85)) |>
-#'   calculate()
-#'}
 S7::method(calculate,segment_abc) <- function(x){
 
   out <- x@fn@fn_exec(x) |>

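The roxygen examples deleted above still describe the intended workflow: a segment function builds a lazy blueprint, and `calculate()` executes it. A minimal sketch of that pattern, assuming the Contoso `sales` table is available and `fpaR` is attached (not run here, since it needs the package's data backend):

``` r
library(dplyr)
library(fpaR)

# abc() only builds a lazy blueprint object (class segment_abc);
# no heavy computation happens at this point
blueprint <- sales |>
  group_by(store_key) |>
  abc(category_values = c(.3, .5, .75, .85))

# calculate() is the execution engine: it translates the blueprint
# into dbplyr/SQL, runs it on the backend, and returns a lazy DBI result
blueprint |>
  calculate() |>
  collect()   # materialize the result as a tibble
```

The two-step shape mirrors the docs added in this commit: blueprint first, `calculate()` second, `dplyr::collect()` only when a tibble is actually needed.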
R/utils-misc.R

Lines changed: 4 additions & 1 deletion
@@ -681,7 +681,10 @@ utils::globalVariables(
     "week_start_date",
     "week_end_date",
     "day_of_week_label",
-    "days_in_month"
+    "days_in_month",
+    "year_index",
+    "week_index",
+    "day"
   )
 )

R/utils-sql.R

Lines changed: 0 additions & 1 deletion
@@ -8,7 +8,6 @@
 #' @param end_date calendar end date in YYYY-MM-DD format
 #' @param time_unit calendar table unit in 'day', 'week', 'month', 'quarter' or 'year'
 #' @param con database connection
-#' @export
 #' @return DBI object
 #' @keywords internal
 #' @examples

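Since this change drops `@export` but keeps `@keywords internal`, the calendar-sequence helper documented above remains callable, just not from the public namespace. A hedged sketch of invoking it after this commit, with parameter names taken from the roxygen block (the `:::` access and the specific dates are assumptions for illustration):

``` r
library(DBI)
library(duckdb)

# an in-memory DuckDB connection for the con parameter
con <- dbConnect(duckdb::duckdb())

# internal after this commit, so reached via ::: rather than fpaR::
calendar <- fpaR:::seq_date_sql(
  start_date = "2023-01-01",   # YYYY-MM-DD, per the docs
  end_date   = "2024-12-31",
  time_unit  = "day",          # or 'week', 'month', 'quarter', 'year'
  con        = con
)
```

Per the `@return` tag this yields a DBI object, so downstream code would `dplyr::collect()` it as with the other functions in the package.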
README.md

Lines changed: 39 additions & 37 deletions
@@ -10,15 +10,15 @@ status](https://www.r-pkg.org/badges/version/fpaR?svg=1.png)](https://CRAN.R-pro
 
 ## A Business Intelligence Toolkit for Financial Planning & Analysis (FP&A)
 
-`fpaR` is a collection of business intelligence tools designed to
+This package is a collection of business intelligence tools designed to
 simplify common **financial planning and analysis (FP&A)** tasks such as
-time intelligence calculations, customer, vendor or product
-segmentation, and factor/variance analysis
+time intelligence calculations, group member segmentation and
+factor/variance analysis.
 
 The package is inspired by best practices from a collection of blogs,
 books, industry research, and hands-on work experience, consolidating
 frequently performed business analyses into a fast, efficient, and
-reusable framework
+reusable framework.
 
 In particular, the time intelligence functions are heavily inspired by
 [PowerBI DAX](https://www.sqlbi.com/) functions

@@ -29,15 +29,15 @@ Under the hood, these functions are built upon the great foundations of:
 - [duckdb](https://github.com/duckdb/duckdb-r)
 - [lubridate](https://lubridate.tidyverse.org/)
 
-`fpaR` is designed to work with either tibbles or modern databases
-(DuckDB, Snowflake, SQLite, etc) with a unified syntax
+All functions are designed to work with either tibbles or modern
+databases (DuckDB, Snowflake, SQLite, etc.) with a unified syntax.
 
-Even if you are working with tibbles, most functions are optimized to
+Even if you are working with tibbles, all functions are optimized to
 leverage [DuckDB](https://github.com/duckdb/duckdb-r) for increased
 speed and performance[^1]
 
-By default most `fpaR` function returns a lazy DBI object which you can
-return as a tibble with `dplyr::collect()`
+By default, all functions return a lazy DBI object which you can return
+as a tibble with `dplyr::collect()`
 
 ## Key features & benefits

@@ -50,52 +50,50 @@ return as a tibble with `dplyr::collect()`
 
 ## Installation
 
-Install the development version of `fpaR` from GitHub:
+Install the development version from GitHub:
 
 ``` r
 # Install using pak or install.packages()
 
 pak::pak("usrbinr/fpaR")
 ```
 
-## What is in fpaR?
+## What is in fpa?
 
 > We recommend using the [Contoso](https://usrbinr.github.io/contoso/)
 > package for any practice analysis. The contoso datasets are fictional
-> business transactions of Contoso toy company which are helpful for
+> business transactions of the Contoso toy company which are helpful for
 > business intelligence related analysis
 
 There are 3 main categories of functions:
 
-- Time intelligence realted functions
+- Time intelligence related functions
   (<a href="#tbl-ti-fn" class="quarto-xref">Table 1</a>)
 - Categorization strategies
   (<a href="#tbl-abc-fn" class="quarto-xref">Table 2</a>)
 - Factor analysis (work in progress)
 
 ### Time intelligence
 
-`fpaR` provides readily available functions for most time intelligence
+This is a collection of the most commonly used time intelligence
 analysis such as **Year-over-Year**(`yoy()`),
 **Month-to-Date**(`mtd()`), and **Current Year-to-Date over Previous
-Year-to-Date** (`ytdopy()`) analysis
+Year-to-Date** (`ytdopy()`) analysis.
 
-These functions are designed to quickly answer most common time
-intelligence related analysis in a consistent, fast and transparent way
+These functions are designed to quickly answer questions in a
+consistent, fast and transparent way.
 
 **Key benefits:**
 
 - **Auto-fill missing dates**: Ensures no missing periods in your
   datasets so that the right period comparisons are performed
 
 - **Flexible calendar options**: Handle comparisons based on a
-  **standard** or **5-5-4** fiscal calendar to accommodate different
-  reporting frameworks
+  **standard** or **non-standard** fiscal calendar to accommodate
+  different reporting frameworks
 
-- **Period imbalance indicator**: When comparing periods with dates
-  imbalance, the time intelligence functions will alert you to the type
-  and number of period imbalances to ensure you are aware of misleading
-  likewise comparisons
+- **Clear definitions**: Full transparency into the calculations that
+  are performed, with visibility into any missing or incomplete date
+  periods
 
 Below is the full list of time intelligence functions:

@@ -133,29 +131,31 @@ Table 1
 
 </div>
 
-### **Segmentation Strategies**
+------------------------------------------------------------------------
+
+### **Classification Strategies**
 
 #### ABC Classification
 
 ABC classification is a business analysis technique that categorizes
 items (like products, customers, or suppliers) based on their relative
-contribution of a value. It facilitates the Pareto Principle (the 80/20
-rule), suggesting that a small percentage of items contribute to the
-largest percentage of the total value
+contribution of a value. It expands upon the Pareto Principle (the
+80/20 rule), allowing the user to determine which percentage of items or
+group members contribute to the largest percentage of the total value
 
-The analysis will assign a letter category to each break point so that
-you can further focus your analysis
+You assign the break points for the categorization and the function will
+label each category with a letter value
 
 #### Cohort
 
 Cohort analysis is a type of behavioral analytics that takes data from a
-given group of users (a cohort) and tracks their activity over time. A
-cohort is typically defined by a shared starting characteristic, most
-commonly the time period in which the entities first interacted with the
-product or service
+given group of users (called a cohort) and tracks their activity over
+time. A cohort is typically defined by a shared starting characteristic,
+most commonly the time period in which the entities first interacted
+with the product or service
 
-This allows you to understand retention, turnover and other attributes
-more clearly
+This allows you to understand retention, turnover and other cohort
+attributes more clearly
 
 <div id="tbl-abc-fn">

@@ -177,13 +177,15 @@ Table 2
 ### **Factor / Variation Analysis**
 
 Breaks down revenue or cost changes into **price, volume, and mix
-effects**.
+effects**
+
+Work in progress.
 
 ------------------------------------------------------------------------
 
 ## Future capabilities
 
-Is there a future capability that you want to see here? please open up a
+Is there a future capability that you want to see here? Please open a
 [discussion](https://github.com/usrbinr/fpaR/discussions) on our github
 site

0 commit comments