| ABAP DataType | CDS Datatype | Properties | Spark Type | ABAP Format | Comment | Transformer |
|---|---|---|---|---|---|---|
| abap.char(1) (*"@Semantic.booleanIndicator: true"*) | `cds.String` | length = 1 | STRING(1) | | We can't enforce the right values, therefore we must use string | - |
| abap.raw | `cds.Binary` | | | | Default: `cds.String(2 * raw-length)`. Later we have to discuss how to encode e.g. images and for which ABAP data types (a dedicated list) we use `cds.UUID` (max. 36); for `cds.UUID`, use the rules from the OData data types | |
| abap.fltp | `cds.Double` | | | | | |
| abap.string | `cds.LargeString` | | | | `cds.String`; the length is either given or blank | |
| abap.lchr | `cds.LargeString` | | | | `cds.String`; the length is either given or blank | |
| abap.lraw | `cds.LargeBinary` | | | | not supported | |
| abap.rawstring | `cds.LargeBinary` | | | | not supported | |
| abap.geom_ewkb | `cds.LargeBinary` | | | | not supported | |
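
Why the boolean indicator has to travel as STRING(1): neither the extractor nor Spark can guarantee that only flag values occur in the source column, so interpretation is left to the consumer. Below is a minimal PySpark sketch of that consumer-side step, assuming the usual ABAP flag convention (`'X'` = true, `' '`/`''` = false); the DataFrame and column names are made up for illustration:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# The @Semantic.booleanIndicator column arrives as STRING(1);
# these sample rows are hypothetical.
df = spark.createDataFrame([("X",), (" ",), ("",)], ["is_active_flag"])

# Consumer-side interpretation, assuming the common ABAP convention:
# 'X' = true, ' ' or '' = false; any other value would be a data error.
df = df.withColumn(
    "is_active",
    F.when(F.col("is_active_flag") == "X", F.lit(True)).otherwise(F.lit(False)),
)
df.show()
```
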
- Datasphere data types are taken from here: [Link](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/7b1dc6e0fad147de8e50aa8dc4744aa3.html?locale=en-US)

| CDS Datatype | Spark Type | Datasphere Type | Comment | ABAP Format |
|---|---|---|---|---|
| `cds.String` (length) | STRING | `cds.String` | Datasphere logic: IF `cds.String(length = undefined)` THEN `cds.String(length = 5000)` | |
| `cds.LargeString` | STRING | `cds.LargeString` | TODO: check this - no length limit? | |
| `cds.Integer` | INT | `cds.Integer` | | |
| `cds.Integer64` | BIGINT | `cds.Integer64` | | |
| `cds.Decimal` (precision = p, scale = s) | DECIMAL(p,s) | `cds.Decimal` | Datasphere logic: IF `cds.Decimal(p < 17)` THEN `cds.Decimal(p = 17)` | |
| `cds.Decimal` (precision = p, scale = floating) | ***not supported*** | `cds.Decimal` | Decimal with scale = floating is not supported in Spark | |
| Amounts with currencies: `cds.Decimal` (precision = 34, scale = 4) | `cds.Decimal(34, 4)` | `cds.Decimal(34, 4)` | Since Spark does not support `cds.DecimalFloat`, we use `cds.Decimal(34, 4)` as a compromise for now | |
| `cds.Double` | DOUBLE | `cds.Double` | | |
| `cds.Date` | DATE | `cds.Date` | | "yyyyMMdd" |
| `cds.Time` must be expressed as `cds.String(6)` or `cds.String(12)` for now, depending on the source representation, plus the annotation `@Semantics.time: true` | STRING | `cds.String(6)` or `cds.String(12)` | Data is in the format `HHmmss` or `HH:mm:ss.SSS`; the consumer must use the function to_time() to convert to `cds.Time` | |
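
For the `cds.Date` and `cds.Time` encodings above, here is a hedged PySpark sketch of the consumer-side parsing. Note that Spark itself has no TIME type, so the to_time() step from the table can only happen on the consuming database (e.g. via HANA's TO_TIME); in Spark, the best one can do is anchor the time of day in a timestamp, as shown here. Sample values and column names are made up:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical rows carrying the string encodings from the table above.
df = spark.createDataFrame(
    [("20240131", "134501", "13:45:01.123")],
    ["date_str", "time6_str", "time12_str"],
)

df = (
    df
    # cds.Date arrives as "yyyyMMdd" and parses directly into a Spark DATE.
    .withColumn("date_val", F.to_date("date_str", "yyyyMMdd"))
    # cds.Time arrives as HHmmss (cds.String(6)) or HH:mm:ss.SSS (cds.String(12)).
    # Spark has no TIME type, so we parse into a timestamp; the missing date
    # fields default to 1970-01-01.
    .withColumn("time6_ts", F.to_timestamp("time6_str", "HHmmss"))
    .withColumn("time12_ts", F.to_timestamp("time12_str", "HH:mm:ss.SSS"))
)
df.show(truncate=False)
```
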