
Commit b282995

Add intermediate representation struct (#810)

* Add intermediate representation struct
* follow comments
* update design doc

1 parent 995bc66 commit b282995

File tree

3 files changed: +150 −65 lines changed


doc/design_intermediate_representation.md

Lines changed: 5 additions & 65 deletions

````diff
@@ -21,77 +21,17 @@ We propose the following code structures.
 sql/
   ...
   codegen/
+    feature_column.go
+    intermediate_representation.go
     tensorflow/
-      train.go
-      predict.go
-      analyze.go
+      ...
     xgboost/
       ...
 ```
 
-The `tensorflow` package will expose function `func Train(ir sql.TrainIR) string, error`, which takes the `sql`'s `TrainIR` and returns a generated Python program.
+The IR and feature column definitions will reside in `sql/codegen`. Each code generator package forms a subdirectory in `sql/codegen`, like `sql/codegen/tensorflow/`.
 
 ## Intermediate Representation
 
-We propose the following struct as the IR for code generation.
-
-```go
-package sql
-
-import (
-	"github.com/sql-machine-learning/sqlflow/sql/columns"
-)
-
-type FieldType int
-
-const (
-	Int FieldType = iota
-	Float
-	String
-)
-
-// FieldMeta contains the meta information for decoding and feature columns
-type FieldMeta struct {
-	DType         FieldType               // e.g. "float", "int32"
-	Delimiter     string                  // e.g. ","
-	Shape         []int                   // e.g. [1], [1 2 3]
-	IsSparse      bool                    // e.g. false
-	FeatureColumn []columns.FeatureColumn // e.g. [EmbeddingColumn, CategoryIDColumn]
-}
-
-// TrainIR is the intermediate representation for code generation of a training job
-type TrainIR struct {
-	DataSource       string                          // e.g. "hive://root:root@localhost:10000/churn"
-	Select           string                          // e.g. "select * from iris.train"
-	ValidationSelect string                          // e.g. "select * from iris.val;"
-	Estimator        string                          // e.g. "DNNClassifier"
-	Attribute        map[string]interface{}          // e.g. {"train.epoch": 1000, "model.hidden_units": [10 10]}
-	Feature          map[string]map[string]FieldMeta // e.g. {"feature_columns": {"sepal_length": {"float", "", [1], false}, ...}}
-	Label            map[string]FieldMeta            // e.g. {"class": {"int32", "", [1], false}}
-}
-
-// PredictIR is the intermediate representation for code generation of a prediction job
-type PredictIR struct {
-	DataSource  string                          // e.g. "hive://root:root@localhost:10000/churn"
-	Select      string                          // e.g. "select * from iris.test"
-	Estimator   string                          // e.g. "DNNClassifier"
-	Attribute   map[string]interface{}          // e.g. {"predict.batch_size": 32}
-	Feature     map[string]map[string]FieldMeta // e.g. {"feature_columns": {"sepal_length": {"float", "", [1], false}, ...}}
-	Label       map[string]FieldMeta            // e.g. {"class": {"int32", "", [1], false}}
-	ReusltTable string                          // e.g. "iris.predict"
-}
-
-// AnalyzeIR is the intermediate representation for code generation of a analysis job
-type AnalyzeIR struct {
-	DataSource string                          // e.g. "hive://root:root@localhost:10000/churn"
-	Select     string                          // e.g. "select * from iris.train"
-	Estimator  string                          // e.g. "DNNClassifier"
-	Attribute  map[string]interface{}          // e.g. {"analyze.plot_type": "bar"}
-	Feature    map[string]map[string]FieldMeta // e.g. {"feature_columns": {"sepal_length": {"float", "", [1], false}, ...}}
-	Label      map[string]FieldMeta            // e.g. {"class": {"int32", "", [1], false}}
-}
-```
-
-Please be aware that all the IRs exclude the information of the current working directory. This information belongs to the `executor` in the `sql` package. For a prediction/analyze job, the `executor` should recover everything produced by the training job.
+Please refer to [sql/codegen/intermediate_representation.go](sql/codegen/intermediate_representation.go) and [sql/codegen/feature_column.go](sql/codegen/feature_column.go) for implementation details.
 
-Please be aware that `TrainIR` excludes the saving table name. This information belongs to the `executor` in the `sql` package.
````

sql/codegen/feature_column.go

Lines changed: 22 additions & 0 deletions

```go
// Copyright 2019 The SQLFlow Authors. All rights reserved.
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package codegen

// NumericColumn represents a dense tensor for the model input.
//
// FieldMeta indicates the meta information for decoding the field. Please be aware
// that FieldMeta also contains information for dimension and data type.
type NumericColumn struct {
	FieldMeta *FieldMeta
}
```
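Because `FeatureColumn` is an empty interface, a backend generator has to type-switch on the concrete column types such as `NumericColumn`. A sketch under that assumption, with trimmed local copies of `FieldMeta`/`NumericColumn` and a hypothetical `tfColumnCode` helper (not part of the actual commit):

```go
package main

import "fmt"

// Trimmed local copies of the codegen types, for illustration only.
type FieldMeta struct {
	Name  string
	Shape []int
}

type NumericColumn struct {
	FieldMeta *FieldMeta
}

type FeatureColumn interface{}

// tfColumnCode is a hypothetical helper that maps an IR feature column to a
// line of TensorFlow Python code; unknown column types are reported as errors,
// which is how a generator can fail fast on unsupported COLUMN expressions.
func tfColumnCode(fc FeatureColumn) (string, error) {
	switch c := fc.(type) {
	case NumericColumn:
		return fmt.Sprintf("tf.feature_column.numeric_column(%q, shape=%v)",
			c.FieldMeta.Name, c.FieldMeta.Shape), nil
	default:
		return "", fmt.Errorf("unsupported feature column type %T", fc)
	}
}

func main() {
	code, err := tfColumnCode(NumericColumn{&FieldMeta{Name: "sepal_length", Shape: []int{1}}})
	if err != nil {
		panic(err)
	}
	fmt.Println(code)
	// → tf.feature_column.numeric_column("sepal_length", shape=[1])
}
```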
sql/codegen/intermediate_representation.go

Lines changed: 123 additions & 0 deletions

```go
// Copyright 2019 The SQLFlow Authors. All rights reserved.
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package codegen

// FieldType indicates the field type of a table column.
type FieldType int

const (
	// Int indicates the corresponding table column is an integer.
	Int FieldType = iota
	// Float indicates the corresponding table column is a float.
	Float
	// String indicates the corresponding table column is a string.
	String
)

// FieldMeta contains the meta information for decoding. A field is a selected column of a SQL result.
//
// Name indicates the name of the field.
//
// DType indicates the data type of the field. For example: Int, Float, String.
//
// Delimiter indicates the decoding method of the field. For example, the field may
// contain a string like "1,23,42", which represents the 3-D tensor [1, 23, 42].
//
// Shape indicates the shape of the tensor represented by the field. For example, if the
// field contains a string like "1,23,42" representing a 3-D tensor, the shape
// will be [3].
//
// IsSparse indicates the tensor type of the field. True means the tensor is sparse.
type FieldMeta struct {
	Name      string    `json:"name"`      // e.g. "sepal_length"
	DType     FieldType `json:"dtype"`     // e.g. "float", "int32"
	Delimiter string    `json:"delimiter"` // e.g. ","
	Shape     []int     `json:"shape"`     // e.g. [1], [1 2 3]
	IsSparse  bool      `json:"is_sparse"` // e.g. false
}

// FeatureColumn indicates the feature column to be applied to the field. Please refer to
// github.com/sql-machine-learning/sqlflow/sql/codegen/feature_column.go for the detailed list of all feature columns.
type FeatureColumn interface{}

// Attribute represents a parsed entry in the WITH clause.
type Attribute struct {
	Key   string
	Value interface{}
}

// TrainIR is the intermediate representation for code generation of a training job.
//
// Please be aware that TrainIR intentionally excludes the model table name in the
// INTO clause. The sql package will save the output files of a generated Python program.
// For prediction and analysis jobs, the sql package will restore an identical working directory.
type TrainIR struct {
	// DataSource contains the connection information. For example, "hive://root:root@localhost:10000/churn".
	DataSource string
	// Select specifies the query for fetching the training data. For example, "select * from iris.train;".
	Select string
	// ValidationSelect specifies the query for fetching the validation data. For example, "select * from iris.val;".
	ValidationSelect string
	// Estimator specifies the estimator type. For example, after parsing "select ... train DNNClassifier WITH ...",
	// the Estimator will be "DNNClassifier".
	Estimator string
	// Attributes contains the parsed attributes in the WITH clause. For example, after parsing
	// "select ... train ... with train.epoch = 1000, model.hidden_units = [10, 10]",
	// the Attributes will be {{"train.epoch", 1000}, {"model.hidden_units", [10 10]}}.
	Attributes []Attribute
	// Features contains a map from target name to the list of feature columns in the COLUMN clause(s).
	// Multiple COLUMN clauses like
	//   column ... for deep_feature
	//   column ... for wide_feature
	// will be parsed as {"deep_feature": {...}, "wide_feature": {...}}.
	// For a single COLUMN clause like "column ...", "feature_columns" will be used as the default map key.
	Features map[string][]FeatureColumn
	// Label specifies the feature column in the LABEL clause.
	Label FeatureColumn
}

// PredictIR is the intermediate representation for code generation of a prediction job.
//
// Please be aware that PredictIR contains the result table name, so the
// generated Python program is responsible for creating and writing the result table.
type PredictIR struct {
	// DataSource contains the connection information. For example, "hive://root:root@localhost:10000/churn".
	DataSource string
	// Select specifies the query for fetching the prediction data. For example, "select * from iris.test;".
	Select string
	// ResultTable specifies the table to store the prediction result.
	ResultTable string
	// Attributes contains the parsed attributes in the WITH clause. For example, after parsing
	// "select ... predict ... with predict.batch_size = 32 into ...",
	// the Attributes will be {{"predict.batch_size", 32}}.
	Attributes []Attribute
	// TrainIR is the TrainIR used for generating the training job of the corresponding model.
	TrainIR TrainIR
}

// AnalyzeIR is the intermediate representation for code generation of an analysis job.
type AnalyzeIR struct {
	// DataSource contains the connection information. For example, "hive://root:root@localhost:10000/churn".
	DataSource string
	// Select specifies the query for fetching the analysis data. For example, "select * from iris.test;".
	Select string
	// Attributes contains the parsed attributes in the WITH clause. For example, after parsing
	// `select ... analyze ... with analyze.plot_type = "bar"`,
	// the Attributes will be {{"analyze.plot_type", "bar"}}.
	Attributes []Attribute
	// TrainIR is the TrainIR used for generating the training job of the corresponding model.
	TrainIR TrainIR
}
```
