From 9fee344221d88d150c1c190e855fd684373eed39 Mon Sep 17 00:00:00 2001 From: Brian Wren Date: Thu, 12 Sep 2024 11:10:17 -0700 Subject: [PATCH 1/4] DCR structure update --- .../data-collection-rule-structure.md | 159 +++++++++++------- .../dcr-flow-diagram.png | Bin 0 -> 10999 bytes 2 files changed, 99 insertions(+), 60 deletions(-) create mode 100644 articles/azure-monitor/essentials/media/data-collection-rule-structure/dcr-flow-diagram.png diff --git a/articles/azure-monitor/essentials/data-collection-rule-structure.md b/articles/azure-monitor/essentials/data-collection-rule-structure.md index 2f79de51ed..7f14d6fed1 100644 --- a/articles/azure-monitor/essentials/data-collection-rule-structure.md +++ b/articles/azure-monitor/essentials/data-collection-rule-structure.md @@ -9,48 +9,98 @@ ms.reviwer: nikeist --- # Structure of a data collection rule in Azure Monitor -[Data collection rules (DCRs)](data-collection-rule-overview.md) are sets of instructions that determine how to collect and process telemetry sent to Azure Monitor. Some DCRs will be created and managed by Azure Monitor. This article describes the JSON properties of DCRs for creating and editing them in those cases where you need to work with them directly. +[Data collection rules (DCRs)](data-collection-rule-overview.md) are sets of instructions that determine how to collect and process telemetry sent to Azure Monitor. Some DCRs will be created and managed by Azure Monitor. This article describes the JSON structure of DCRs for creating and editing them in those cases where you need to work with them directly. - See [Create and edit data collection rules (DCRs) in Azure Monitor](data-collection-rule-create-edit.md) for details working with the JSON described here. - See [Sample data collection rules (DCRs) in Azure Monitor](../essentials/data-collection-rule-samples.md) for sample DCRs for different scenarios. ## Properties -Properties at the top level of the DCR. +The following table describes properties at the top level of the DCR. | Property | Description | |:---|:---| -| `immutableId` | A unique identifier for the data collection rule. Property and value are automatically created when the DCR is created. | -| `description` | A description of the data collection rule. | -| `dataCollectionEndpointId` | Resource ID of the [data collection endpoint (DCE)](data-collection-endpoint-overview.md) used by the DCR if you provided one. Property not present in DCRs that don't use a DCE. | +| `description` | Optional description of the data collection rule defined by the user. | +| `dataCollectionEndpointId` | Resource ID of the [data collection endpoint (DCE)](data-collection-endpoint-overview.md) used by the DCR if you provided one when the DCR was created. This property isn't present in DCRs that don't use a DCE. | +| `endpoints`1 | Contains the `logsIngestion` and `metricsIngestion` URL of the endpoints for the DCR. This section and its properties are automatically created when the DCR is created only if the `kind` attribute in the DCR is `Direct`. | +| `immutableId` | A unique identifier for the data collection rule. This property and its value are automatically created when the DCR is created. | +| `kind` | Specifies the data collection scenario the DCR is used for. This parameter is further described below. | +1This property wasn't created for DCRs created before March 31, 2024. 
DCRs created before this date required a [data collection endpoint (DCE)](data-collection-endpoint-overview.md) and the `dataCollectionEndpointId` property to be specified. If you want to use these embedded DCEs then you must create a new DCR. -## `endpoints` -Contains the URLs of the endpoints for the DCR. This section and its properties are automatically created when the DCR is created. +## Kind +The `kind` property in the DCR specifies the type of collection that the DCR is used for. Each kind of DCR has a different structure and properties. + +The following table lists the different kinds of DCRs and their details. + +| Kind | Description | +|:---|:---| +| `Direct` | Direct ingestion using Logs ingestion API. Endpoints are created for the DCR only if this kind value is used. | +| `AgentDirectToStore` | | +| `AgentSettings` | | +| `Linux` | Collect events and performance data from Linux machines. | +| `PlatformTelemetry` | | +| `Windows` | Collect events and performance data from Windows machines. | +| `WorkspaceTransforms` | Workspace transformation DCR. This DCR doesn't include an input stream. | + + + +## Overview of DCR data flow +The basic flow of a DCR is shown in the following diagram. Each of the components is described in the following sections. + +:::image type="content" source="media/data-collection-rule-structure/dcr-flow-diagram.png" lightbox="media/data-collection-rule-structure/dcr-flow-diagram.png" alt-text="Diagram that illustrates the relationship between the different sections of a DCR." border="false"::: + + +## Input streams +The input stream section of a DCR defines the incoming data that's being collected. There are two types of incoming stream, depending on the particular data collection scenario. Most data collection scenarios use one of the input streams, while some may use both. > [!NOTE] -> These properties weren't created for DCRs created before March 31, 2024. DCRs created before this date required a [data collection endpoint (DCE)](data-collection-endpoint-overview.md) and the `dataCollectionEndpointId` property to be specified. If you want to use these embedded DCEs then you must create a new DCR. +> Workspace transformation DCRs don't have an input stream since the -| Property | Description | +| Input stream | Description | |:---|:---| -| `logsIngestion` | URL for ingestion endpoint for log data. | -| `metricsIngestion` | URL for ingestion endpoint for metric data. | +| `dataSources` | Known type of data. This is often data processed by Azure Monitor agent and delivered to Azure Monitor using a known data type. | +| `streamDeclarations` | Custom data that needs to be defined in the DCR. | + +Data sent from the Logs ingestion API uses a `streamDeclaration` with the schema of the incoming data. This is because the the API sends custom data that can have any schema. + +Text logs from AMA are an example of data collection that requires both `dataSources` and `streamDeclarations`. The data source includes the configuration + -**Scenarios** -- Logs ingestion API +### Data sources +Data sources are unique sources of monitoring data that each has its own format and method of exposing its data. Each data source type has a unique set of parameters that must be configured for each data source. The data returned by the data source is typically a known type, so the schema doesn't need to be defined in the DCR. -## `dataCollectionEndpointId` -Specifies the [data collection endpoint (DCE)](data-collection-endpoint-overview.md) used by the DCR. 
+For example, events and performance data collected from a VM with the Azure Monitor agent (AMA) use data sources such as `windowsEventLogs` and `performanceCounters`. You specify criteria for the events and performance counters that you want to collect, but you don't need to define the structure of the data itself since the schema of the incoming data is already known.
 
-**Scenarios**
-- Azure Monitor agent
-- Logs ingestion API
-- Events Hubs
-
+#### Common parameters
-## `streamDeclarations`
+All data source types share the following common parameters.
+
| Parameter | Description |
|:---|:---|
| `name` | Name to identify the data source in the DCR. |
| `streams` | List of streams that the data source will collect. If this is a standard data type such as a Windows event, then the stream will be in the form `Microsoft-<TableName>`. If it's a custom type, then it will be in the form `Custom-<TableName>`. |
+
+#### Valid data source types
+
+The data source types currently available are listed in the following table.
+
| Data source type | Description | Streams | Parameters |
|:---|:---|:---|:---|
| `eventHub` | Data from Azure Event Hubs. | Custom1 | `consumerGroup` - Consumer group of the event hub to collect from. |
| `iisLogs` | IIS logs from Windows machines | `Microsoft-W3CIISLog` | `logDirectories` - Directory where IIS logs are stored on the client. |
| `logFiles` | Text or JSON log on a virtual machine | Custom1 | `filePatterns` - Folder and file pattern for log files to be collected from client.
`format` - *json* or *text* | +| `performanceCounters` | Performance counters for both Windows and Linux virtual machines | `Microsoft-Perf`
`Microsoft-InsightsMetrics` | `samplingFrequencyInSeconds` - Frequency in seconds at which performance data should be sampled.
`counterSpecifiers` - Objects and counters that should be collected. |
+| `prometheusForwarder` | Prometheus data collected from container insights. | `Microsoft-PrometheusMetrics` | `streams` - List of streams to collect.
`labelIncludeFilter` - List of label inclusion filters as name-value pairs. |
+| `syslog` | Syslog events on Linux virtual machines | `Microsoft-Syslog` | `facilityNames` - Facilities to collect
`logLevels` - Log levels to collect | +| `windowsEventLogs` | Windows event log on virtual machines | `Microsoft-Event` | `xPathQueries` - XPaths specifying the criteria for the events that should be collected. | +| `extension` | Extension-based data source used by Azure Monitor agent. | Varies by extension | `extensionName` - Name of the extension
`extensionSettings` - Values for each setting required by the extension | + +1 These data sources use both a data source and a stream declaration since the schema of the data they collect can vary. The stream used in the data source should be the custom stream defined in the stream declaration. + +### Stream declarations Declaration of the different types of data sent into the Log Analytics workspace. Each stream is an object whose key represents the stream name, which must begin with *Custom-*. The stream contains a full list of top-level properties that are contained in the JSON data that will be sent. The shape of the data you send to the endpoint doesn't need to match that of the destination table. Instead, the output of the transform that's applied on top of the input data needs to match the destination shape. -This section isn't used for data sources sending known data types such as events and performance data sent from Azure Monitor agent. +#### Data types The possible data types that can be assigned to the properties are: @@ -62,58 +112,47 @@ The possible data types that can be assigned to the properties are: - `dynamic` - `datetime`. -**Scenarios** -- Azure Monitor agent (text logs only) -- Logs ingestion API -- Event Hubs -## `destinations` -Declaration of all the destinations where the data will be sent. Only `logAnalytics` is currently supported as a destination except for Azure Monitor agent which can also use `azureMonitorMetrics`. Each Log Analytics destination requires the full workspace resource ID and a friendly name that will be used elsewhere in the DCR to refer to this workspace. -**Scenarios** -- Azure Monitor agent (text logs only) -- Logs ingestion API -- Event Hubs -- Workspace transformation DCR +## Destinations +The `destinations` section includes an entry for each destination where the data will be sent. These destinations are matched with input streams in the `dataFlows` section. -## `dataSources` -Unique source of monitoring data that has its own format and method of exposing its data. Each data source has a data source type, and each type defines a unique set of properties that must be specified for each data source. The data source types currently available are listed in the following table. +### Common parameters -| Data source type | Description | +| Parameters | Description | |:---|:---| -| eventHub | Data from Azure Event Hubs | -| extension | VM extension-based data source, used exclusively by Log Analytics solutions and Azure services ([View agent supported services and solutions](../agents/azure-monitor-agent-overview.md#supported-services-and-features)) | -| logFiles | Text log on a virtual machine | -| performanceCounters | Performance counters for both Windows and Linux virtual machines | -| syslog | Syslog events on Linux virtual machines | -| windowsEventLogs | Windows event log on virtual machines | +| `name` | Name to identify the destination in the `dataSources` section. | + +### Valid destinations + +The destinations currently available are listed in the following table. -**Scenarios** -- Azure Monitor agent -- Event Hubs +| Destination | Description | Required parameters | +|:---|:---|:---| +| `logAnalytics` | Log Analytics workspace | `workspaceResourceId` - Resource ID of the workspace.
`workspaceId` - ID of the workspace.

This only specifies the workspace, not the table where the data will be sent. If it's a known destination, then no table needs to be specified. For custom tables, the table is specified in the `outputStream` property of the data flow. |
+| `azureMonitorMetrics` | Azure Monitor metrics | No configuration is required since there is only a single metrics store for the subscription. |
+| `storageTablesDirect` | Azure Table storage | `storageAccountResourceId` - Resource ID of the storage account
`tableName` - Name of the table | +| `storageBlobsDirect` | Azure Blob storage | `storageAccountResourceId` - Resource ID of the storage account
`containerName` - Name of the blob container |
+| `eventHubsDirect` | Event Hubs | `eventHubResourceId` - Resource ID of the event hub. |
 
-## `dataFlows`
-Matches streams with destinations and optionally specifies a transformation.
-### `dataFlows/Streams`
-One or more streams defined in the previous section. You may include multiple streams in a single data flow if you want to send multiple data sources to the same destination. Only use a single stream though if the data flow includes a transformation. One stream can also be used by multiple data flows when you want to send a particular data source to multiple tables in the same Log Analytics workspace.
-### `dataFlows/destinations`
-One or more destinations from the `destinations` section above. Multiple destinations are allowed for multi-homing scenarios.
+## Data flows
+Data flows match input streams with destinations. Each data flow may optionally specify a transformation and in some cases a specific table in the Log Analytics workspace.
+
+### Data flow properties
+
| Property | Description |
|:---|:---|
| `streams` | One or more streams defined in the input streams section. You may include multiple streams in a single data flow if you want to send multiple data sources to the same destination. Only use a single stream though if the data flow includes a transformation. One stream can also be used by multiple data flows when you want to send a particular data source to multiple tables in the same Log Analytics workspace. |
| `destinations` | One or more destinations from the `destinations` section above. Multiple destinations are allowed for multi-homing scenarios. |
| `transformKql` | Optional [transformation](data-collection-transformations.md) applied to the incoming stream. The transformation must understand the schema of the incoming data and output data in the schema of the target table. If you use a transformation, the data flow should only use a single stream. |
| `outputStream` | Describes which table in the workspace specified under the `destination` property the data will be sent to. The value of `outputStream` has the format `Microsoft-[tableName]` when data is being ingested into a standard table, or `Custom-[tableName]` when ingesting data into a custom table. Only one destination is allowed per stream.

This property isn't used for known data sources from Azure Monitor such as events and performance data since these are sent to predefined tables. | -### `dataFlows/transformKql` -Optional [transformation](data-collection-transformations.md) applied to the incoming stream. The transformation must understand the schema of the incoming data and output data in the schema of the target table. If you use a transformation, the data flow should only use a single stream. -### `dataFlows/outputStream` -Describes which table in the workspace specified under the `destination` property the data will be sent to. The value of `outputStream` has the format `Microsoft-[tableName]` when data is being ingested into a standard Log Analytics table, or `Custom-[tableName]` when ingesting data into a custom table. Only one destination is allowed per stream.

This property isn't used for known data sources from Azure Monitor such as events and performance data since these are sent to predefined tables. | -**Scenarios** -- Azure Monitor agent -- Logs ingestion API -- Event Hubs -- Workspace transformation DCR diff --git a/articles/azure-monitor/essentials/media/data-collection-rule-structure/dcr-flow-diagram.png b/articles/azure-monitor/essentials/media/data-collection-rule-structure/dcr-flow-diagram.png new file mode 100644 index 0000000000000000000000000000000000000000..0945d124cc5cd8689b717fc54e944d8512d81e5e GIT binary patch literal 10999 zcmc(F2T)U88)lFuy(7INLTJ*Y_bv)jLg*UlRhkq*q=()NNLM-%z|cXO6a|7PT|hv3 zN9j$%UM%1D|GPW0Gdnx8nIR;(_n!0CbKdtkPn6zW4H7~ILJ$Z%|qfpCc9Y9djhx8gzjH zgKE&7ouL+6Adu(+FKYMr*w^3R-`Dqeds^n?LJqjPy}kWO;UWNdJ$d3}1_HIqLU|#; zqfZ|_dgSiTP9Tb@bbgvyEver!<8;@sSWprBkH z0gy1hBZ@<2WP7{926X>jR!)wkxj7UMbl-CXwlr=P6nKGYxx5f?ev0s^0JSrfbVti3 zB^Mi{Vm-s%n{sPoW@7q#`z#h3)E;tjel#0|Ik7{7K+dFHgdpNjDAu#|A518Zz}>r& zjdh!bUoCQ(m>tj^gS|`H=~(v}$iX0CDhYnbwxCfw6UmLFFD0Ck<%jssl9Z63cAXbx z&?uLpkhdFX&Mw0C8a4>%{`F=rh zh4sE{6L*{*EP3su-1vR!B5QO?#qRFn9JDlb6|Kw8+30;$?3n-->Iz77&Inf6;p#<9 z<4|rKMRTwbttS3ejvXSx!^7UdqaGJ0jr=6{l|8eL~B*Pcb7{z^rlL`VZngZM84`CRkj~w9K?WkDdmFzRPr`b zb&f4(uy_R2dHPailcA@7+9EzD$l*rj`qjq-C2PSl;LvQ7QSoPm&n_huZi4dO`@sIB zl#Iicka%=(Hz!EU@O%fSvUS2#1qnjp{2G_U2l5p^d^3;(vnPJq%c%82{`1w+lIOVH zQ${tWuBad1wW9NK?gyerxgfsRTx&QXAV|m`kM8P7PGev!-e63Y_9-s5x=~t2)I-jP6K*D0KC!t((HfpC{9-&M?+iN1r39;b=aLP=|7Lw z^lxo#?dk;VSyCa^qotQ zbrg{`?>jp?F&NDGdCSrRtR63PK?25~^|y9~;uRGYKz0dW$0%L5D4G%B{1qzCpbl}> z)zt?r3zW7oI!!t)HtK(jHN>#t8$q&X$LurmEBeX7XZ1b6Pzg|H8mk)q(P%W^fZT*7} zg9(Ki#7n;F*~@_S6an9u<_DrEIl0?@XA&TI1|DjNL6j1F3X5Bh{-OV|@EVv;SUEVa zazXXg5XHH_>no34xu#}8@#rA;)$AKD49(XHvPgpC3+!GcBW6t?jKo+W4AHa~$p)X? zE`9%N`kRa>{f&fYQHgBVm^66S7H@@E&3-IUrd3i-CRA`9=+YE`h?rX9OEqpEu)mM8 zU8@sLV3YwgwWsUScIsC{ere3zmR@x3`D#&|N#QfioVGfD7xSPV{=9MO#*p@CRupkD zzsIyG(#z+z2e-IAHiz2C{lKc13^$6F-F3MyBNGU;n)l{i+J)qO&8-p1h={zy94P%` z@aJ*+nr}Bw8$HG?M-f?D;?N_h-t=VX%=9!xFJ;y(bZRu>!v~GE#!aaYA3h8Py#J-Lwf-{6FJ7?ox2K zv#HIT859-*3!{=V7@Mk-ulUH7GU~*y4fNS~+Z84_6e5&;Hh^%Jn1LmUUr&2~a&mGQUB1MotyUNQE>}tgu%UBTG}GEiVRC*7 z+LugO*WZebdu~~+#s@REZ?#l(Xh^QCQToUyTj3?A?OO#(`fV*Mq@l|sbLwtw$!GlH zpy&vR509e|U(Ebko&3JwS~5(IZKzXmIJ`+(q-hA=!cs&nN1~cLu15KaR4o$lm4&^;iY=EqImU2K| zs*!^$ZB%bks7Q&e!*}ekU`zBh5^IbPU?|Gg1Q_XTI>J5MtXtH4IsBrP5SUyE*__In z3s$k}5!EDK`dp$A-)6xE$$57pMSi<)qSafn6gt?2d>)-BQ=ehi&>u5AJpA*4^4|yD z=7p%ih=qg&w`*{O-H$#kA>&CiVH4Vzf?bt;6>FMb>e_T|R4mnup<+1w0Jbp{1(DXRX?FXm# z%5?TO8pHb@vlzbB6I%|<|A|@RJjW~zJ+!$Sw5TLHIgi&SPLJ9peytTG=w4ec`QEbs zQV2d&o?*TrarUX#(yNjN*{$`>k!bUIp50F;_NP%V=vieS3)qh2%r8CjJ|(C>VH4#` z;9IBprJQVLgM(pLne;Xp>9iL=J;Bq!dRP2i98a)gmlcU7 zRkgc%H+9c}d!<&pS&*A1LZ)d+uk4|16J4-U?vPV>j6QBmj}gTO7Xv0HrY<8c|0ULo zV|`t-F{#T^eb2X zFROBht1z3ZnUl$LMz0RVGVl72QQ&oHhnM$Q2Gq96X3&r;<)hBTeQ~lge(zPZjmR)Q-_sm=LH58yvBc zmsa1_9@8uz$Y1fe_5(?~)lsM`uVJlH!?E}9M55W%s2i60f@dps+k7T@4peLqyf>mS zO{{A%9s8wO$M2xGKQ)so4+ve0S^fTOF^^`*&%4H4F4cO_dE-$D%@bNPK zpVb9r_n#l$wJ~+~E8`k>p!493;xwgY^v-(LRB%FBl`tve_T4DVWJ`i|9<3PCU$Myv zN8pA){RHQt$DE|i->==_oueEFEAR&MY$K76szdxk;qL{Xx=u1vmPY-^Gw7SVt#{K) z+3KF&r;jema0J#@F8rqE@=8ixy5~7d+Ivc6;?SCUW$DUz$dLqNj-!o#xokNU70D?HX z^!{;h@}sE8jkl^>ts%?C4TT|&fJCdpBVJzLwUt}Z@1tRf*t|1g#v6FI*t_~b5>{ZE zAJGJ5g?7las&hmNv&B74QmvKIL74s!s-U%6WF-1#DI~`f`31ffI;?6Zn7w<Lczu_rfXXMEW>wHDF1j>gpp(oSvvy1J>e(^Y?PE zz@N_;nz+@Ttgs4VMqS?-x{?ptKZ)NyY}}SfVm+zzpR`3jo+LA9fibQ-x`4X+$X)%V zf4=^)%J>MsusA_vUod5JHTraJIPc_M4P{(Q6(Y`liMNYTC)PYQRT;@o1AN2=vIjvZ zN$cmG==uOhdu`2TXRoVIdyO9%U4)2G-CKu19{a|zK^X9V*-MxNPV)kY-vwt)q~x4{ 
zwaqNo&WXA&I(r1r6KSau_{clJ#`Mb#D?waM~k2r|-nS!Y5#0tDu zr%O$bL7bnTXJ#^LJqmLM3WHxm_A4T$=sa(hWOW8vAWp+sPf+=cu^5c)?rlQ5b^nw-*?TyBquVPy&zCq(d&&)>k zbhw;eXLP2suXuJ8@|U|TeIq4GQht&N>3{zs^1(fb zdw)^vIp{DoP13qpE6%k()L8uSLvJzMW+tL5itLCi=(scY;-}q)m^1qyGxf-eV52xR zN-<0jpN)>b>`bl={RG;w3g6{)3p~LP5*%a4Jw=u6O4yon)zw{|oX+fT>!Z7Ir9pgK z%3gt}*n|i7{2vIE_Y_>5e-BE#&{&+RcYE^WNvarbQ=~B5A}rL9WF{4OC*6~cIX~N|?Nm?K zvKlAyslx+m7?V_5+~RYA(~41QFb$za9Nh!968*OajQR$L^h0>%zD7#R1q%zq5eNC`%FmqzF(>!AC)wP%d}BLv972E)@lJ3cN&w#zVX zz=MB-H56)c{g;(5^PH?}?A@y4uI0CgM zued3nA$xtOeQJT@SM>k6G4{@R+;usLlc*))U%Foo>A=+0_hkgeJ$rTV6aR-<{diTO zBdz6c5+3`-k&3&9^?W-k*G~^{rAGRxjCgGlL29^7X%p#7>s2R^t%6@%2W2gN>#A|)_d&Zb98vPs2L+l+Vc{1Sg*sg3>t8_^y8AwU{J8Qd_o2TlP$&aS>ff1pVi09z zW!c%xL0SETkjXm0I)7DHMRDNwz6HT>$8i1~i4$fM6r{jMGxzNqo28F}Dqm~-c#A*; zq7F#Kf0WB(!5r4N0nL#*Y8Nfk*0r%6*&HO$Se~}~K>;%;O=1T4@~>!|Fg2X8pKaCE zoFVt?jG4eLQ-GcSS~L%u)b;UW+$v(i(j8gj0{G^yIeTGhlhW_{DmdLA@ z*g*#Ue?L&#)#v=6&;{7D{=Yitch+6G!)~T;Ueh6jh5mUq3qT>uIF$hQ@Hc`;Oi;PNZhAb!k zLTN0AS91P|RtHw8&P7n>H=8Rjuy=kyBl*;j{ka<{$9KzpG}3%*O-0ePV38S~@zkd@ z(hr0(%if~%Ss{rf8&)8Dyqd|Ix6snM zAmZdIg1cDe&Id`(_0jrrB$K$Ba5dHeZ$k2p#@&4=|26D}Wh5slP(B$_5fl9bBBd+r zh-oqgN_)VnzFZ7zn+u&TR=Sc?Qwh z==*UH6G^3n6dPDuTkGx)x4D0_rvN=-X3Iqmo(jpK^p|L0ZkznMRkHfofWuJKuzbJl zbfHTT>4iP?X(`PYPkms5e<|DCG&!9glB|-E4GVc#B+PqL&%pq8!(E|~RERa>ovE;x z&l=}@`+-jWiTo0`^ws1WO&nX96F*+z0-E>8bG(fLrbRAXRYE#b%)QiJG41o=QTfh9 zq@gCwTtki8W$MFPTj_AR9_-i9(ty)zL;4q zh~}m~+g^+uRufa9(CXcZIjudKE|jF1HXEEK{6MT``{`cWcfll!@+z+9Q)wLQTWP5Ff09VbGk0h!*sHH< z7UvphbU*uMBu(5{f}%h;lz!dlJB8fbGF)hQxo5ICr6N_fI{M?v@py$Ds`I?H6*y7= zDzM*`l^YFs=ZBp1y?qJW3t!sep)V~u=~0nBu}i{pgVp0*>nOxLZN$OdJLhbz5REtL zD?I*2avbgf5t-7bJ>CxYy_0ZaYu^q+#>azy?byGn8#{i~eD2bRl(Fo>$NQ5XvC1e# zA7dgoPq(+5-d2AjJy_+S7pT#39{dVJT2{A~8pY9~4&*iLtH9S5^eX!_3XtmoqXO5Y zy5BgCmFD%k)>?mB-K~4VNAcfc`~Qqc5o5`Fr_b_9`N5l%dakjNamnZo`@^AEgxj7i zwgv|Jo;LYvn>wSgHskX;yy?8@w@p1QMcJ>SXA%1vXWsu&{Q13w60Dvn?Sq&hBq}Sl zV2P0F_;2&es_w7TPyrQ=5VsWj7U)i|`fv+jAoal8+lT(6PvD)dH&-k+dNSg-pnU&8N0ztEhI`Kzj5_KsEE4smY-Ir|qWsWL#J! 
z%OKyaZl8@0-%U`7Mge7oZmU~yhZ~AWXPzqI-I_=>SlOmSK>4+u^=Rf8O0dVPzR{Cr zEk6p%FM-u@FGW3T*1^*`)?reC64EUB3Eg}^AGxQFqP*1)pU%74po=yGmGjOej{5$yi76lNcxREwys0%~7G*t1hD< zVZoA0Xr@ZHwBd*3u4v+yvZu#wB4Pr5zRISu9cle$ewE6~+2=kJPI7_EDS4wQBEp>W zl&v=hVzWng-@SCV2)be+nk@3{jbmHs2*~HIn5!SUSjVkO}U>|{W_QxIKY5w;w z%kFX*Ze2=x3{4c(5p$PzlnmFEIWbL*5K5-S7kzkzB?u9JVK02;+MUcot9#0JF{<#j zVr2nMWP#t0y;qCweA>#U5oL6rJXqek>kLfah_O3N3nSxjIBCmCI;|WkE16_BqtvII zBsdy;Sh^nj?Z7Kc&wwHJz#uDZvAQgQ3O?QMO~TrtK#mJI9geUO2|{sv%Cg>2LOvg1 ztZ|*0^F#6kIrIC`*6P3ie#XnZp*B)#j0#wKXS5Nn&98yavw!F3{!xZNQd3a+Cjq_w2@OKLFFFtmw*`QU^H1*`wd~z5# zy0kYgvD;OR9snY!cbh#tJmlJ)N|JOJJsoeughSmMQq+^mTiWvN=Enrj{O*Bme%_-P zTF>EM9w~!EO`>EUC6FlJ3^@)zqz8?Siv5ep^qSq}TR)*LCApo^m*cQa<(q-#L=0ze2=c{A}{Q4cQmixtlI zhjqssZumG&%y!NQ1{u4~%_-h+ER0p)a}?4doi*7J!Y(z4{Y+%zV8aV5^!@5!AH74> z)a(zxJ;%mQ=%x`bNWfm~nV0e0oC_)~zLdFX#D6;c%2w%i#WWz6uEv|9>=X*Z#Ww5j zqeh_x`Et-J=bFsGlAaWZgQ`;e-AR824h(`W&aAePl3sh;JyX69NOko@-+kyakls9c zoQT6e9X_a=RXVJwCI{m<`9gOUVFHJivedmzH9i(fhnC_HX)y<+&lge&gkfRWx?wF?|>E^ikM*kV6^ z3Y<1wqC%HwNa@eRDL2mdVR3KBQCEa`Zl}Bd>6rV?kpP4W*H6OH=(+3)XxJd%xG(+N z-yI#+M*u(#zZ=%eepuu(9sdIm0&XMtuZ9@_%rBSk`wX-~ko{1DhphzmuK#h<8+FuB z;8W8Vd@hfp> zWzKyHR25-%Rf@MxCeGd2Ta1Ga{6RP78k=EQ%{2#u6`{3o8oBA&uW}?J!K%LMqErXg zvexQ0mJfdekBr4%3b?7-d?9z5^11TR?Lrut;%z6zS{-GR>%O!}WqQ*-$cT>D(|mgC zk6bEXw)sE>6C7JdZ1mI0eIes|3nMy_mP`TBLP+3E+vYstj~NeHr)PR0?;fS_(%HIf zDO73Uw2;P6X~Ub!MzOb>%_5Fm1;(x4-!9!1U1uZ)vrm(Lva}=(4Rmwc^?0ZkmJsX_ z$lI6GP3Pm4G|#kbe}280Tc%>Y*F2FPk)zfEz|7a-h&awE)q1S8oSQYw=_#&O5GDcG zjrvsU#5SqBnj7xlIGswMMg})a%hTqN7H132RqUxd;^j&GrXzaf;Le^Gg#_RTc{0a) zlta>Yp8oK35MS+W%%t4u$0Ywy;u5$RA2z-)@~7D-q)EyvHmhvf%1A6Avd4AoH|hg% zJCTEJBAIN83v{J!8Fr^>BxH@|Rq!wJGY$)iex_jU8y9FQ#^qfY)@}G0c>{uzZqe0; z7lpn}*ThorDDE~8rta3otII7t4G8v= z#%=vtx@EV*G2LnlI!+w!E!?aolo_5Cm`1TW~M}N{>&^oe&4T`StwX z!T)Y;J1&6osuZ_jPd_fI5?jV|yaC&OFCdbc7OX6PUrwkmGRxp;?L!N&9(|xhyzX_9 z2h6tNX3GEuBKVDDcdj$?i+H1?%@IY+0dbM)*qug+rto~UR`kL~eJ{Ct8F=I=#*Ca_ zQT2KtxGUP6|LxnisSZTxWv-Bb_l4f_;^i^y0$8~an7qkSWumTL^PD`qw602JQ;iZYbG!rgiCODP z*kEp~+iLF^$+bda`^GP+6NDb0d!)OxeaW%Q)3QzHs5ei^qjx7KR*YrFYOhQZ-MT-a zv6Wqa#Mf~N@bB2LHHs{qKEGTsdA*jbnO`||SE>taM)Py4w>5N4mba1%aBb+xhoA&t zYH*@a5t4iTT2e~V!4zk&0u;Vy4Ddk+xU)5&_`&3;oYP`Uy!F~le)gIZ;~k3wOVh@N z21z-&+Wy#deWc;Tg3zyWgL+6Yvf8`8x=hD}YR!Vp1+RJQ3Mhyy?rE6vhwn#cy)0=-(FoUijKVwdG0Y=>mub=aZiaR^67h| zl%hrbWZG!X6}cpLFAH$foWFT1Z|i2ScKrmKL2B-64#={ix#*I1(&(D8qp8_}hyS1% zxpbW-(iv^wxyosG@$TKbt*!EVPBNlckWWutITiaHa~E8i!_* zVC4kA9!yKSY`X`(W8)yq0#yT`c<5`h5}1USU!uXZW$C(#55 z9l-Hmx^o(oqM*|w(HZ@gl{Wq=wUiVs?)2w) zxtt0dk`<6EA2>NRm6G!z8-Mxz+FEHpaCjUqJ6!89B_Sv%2pmuQPPm&TBChW(S%IQ~ z{SX`259{tBRRMNZ;hy_Sa3ecUAof`=U>0-c5c|qXtF}JQkVf{Ht%VN>fS3R<66(_d z9PqPWE6{2Fw;3;@0E4L5R&vcb^I8DT7qu5rKM@5B0I@!q548mZOf=Cz@{p0f-;~!)S;Bjvz4Mg+OSDzqQ_7 zBKb?_0QMG{JXH=Jo+7QU*WN~L_bfFE!zs?|>i0eSuG^Tm^flff;>I48`^(^9gN*4M zDc?z?;$ZygLQTbpeZ96ZgPjIrX%{%J-2~D1f7t|JY<~sv-x1-mRh9m7PWlkqMlEr> zk{YZAzIjV;`jRtt$>RWA)Iu%IZXL1h=_>>%)z0|mM|g&&vQkP0?V4U0P33hnI8=sX04rst3hPcM3OiJ)y0SnkN&E-11xaBc@w%F25%E| zN>BQS@AJRnAOKJ8lBu&bRiD7BV2Vc@`VU}z$$%PS@Mz!s{+$fdB~cTtQ(qE&cJ>6F z06?3U1gJu#)_{}UlRZ0h`<2ACRIU1cUqFES*V!PGi|1!YR&7sfiN)e{_E+7?+0NWznB7;u;qhx=$P9nE%-2zry0K4)i=uQupNCcobnz7TH-hie0 zak{k6diLC;hUR}kaPgSOlK>|CeV zn4N8c2co^?fnhMMHXx9Mh5+O*_X*7N29GII^;{vvYDxfq5ciA5P591_cDv z$b#C{>1n|stBB4{6-%-!P-+gn`Pz%K?`@Za%wZ%oSUKSF!QLkYR#w&z^^;hNWdW!A zT4H^`qHuE3=z;|T?S11#Q6}*~0k%(FE|y~cC8M%__)0s|$?OG&)w!7_b09}U8z8HK NwAAjZRw&;O`5yq~zu*7> literal 0 HcmV?d00001 From 397e298a811afbde8a56ae9192fc11904dd13786 Mon Sep 17 00:00:00 2001 From: Brian Wren Date: Tue, 15 Oct 2024 21:28:25 -0700 Subject: [PATCH 2/4] link 
fixes --- .../container-insights-data-collection-configure.md | 2 +- .../containers/container-insights-transformations.md | 4 ++-- .../essentials/data-collection-endpoint-overview.md | 2 +- .../azure-monitor/essentials/data-collection-rule-samples.md | 2 +- .../essentials/data-collection-rule-structure.md | 4 ---- articles/azure-monitor/logs/create-custom-table.md | 2 +- articles/azure-monitor/logs/logs-ingestion-api-overview.md | 2 +- 7 files changed, 7 insertions(+), 11 deletions(-) diff --git a/articles/azure-monitor/containers/container-insights-data-collection-configure.md b/articles/azure-monitor/containers/container-insights-data-collection-configure.md index 4624cfb067..7021ab46f0 100644 --- a/articles/azure-monitor/containers/container-insights-data-collection-configure.md +++ b/articles/azure-monitor/containers/container-insights-data-collection-configure.md @@ -230,7 +230,7 @@ The settings for **collection frequency** and **namespace filtering** in the DCR When you specify the tables to collect using CLI or ARM, you specify a stream name that corresponds to a particular table in the Log Analytics workspace. The following table lists the stream name for each table. > [!NOTE] -> If you're familiar with the [structure of a data collection rule](../essentials/data-collection-rule-structure.md), the stream names in this table are specified in the [dataFlows](../essentials/data-collection-rule-structure.md#dataflows) section of the DCR. +> If you're familiar with the [structure of a data collection rule](../essentials/data-collection-rule-structure.md), the stream names in this table are specified in the [Data flows](../essentials/data-collection-rule-structure.md#data-flows) section of the DCR. | Stream | Container insights table | | --- | --- | diff --git a/articles/azure-monitor/containers/container-insights-transformations.md b/articles/azure-monitor/containers/container-insights-transformations.md index a1c3a4e23b..3edc9b42ac 100644 --- a/articles/azure-monitor/containers/container-insights-transformations.md +++ b/articles/azure-monitor/containers/container-insights-transformations.md @@ -24,7 +24,7 @@ Transformations are implemented in [data collection rules (DCRs)](../essentials/ ## Data sources -The [dataSources section of the DCR](../essentials/data-collection-rule-structure.md#datasources) defines the different types of incoming data that the DCR will process. For Container insights, this is the Container insights extension, which includes one or more predefined `streams` starting with the prefix *Microsoft-*. +The [Data sources section of the DCR](../essentials/data-collection-rule-structure.md#data-sources) defines the different types of incoming data that the DCR will process. For Container insights, this is the Container insights extension, which includes one or more predefined `streams` starting with the prefix *Microsoft-*. The list of Container insights streams in the DCR depends on the [Cost preset](container-insights-cost-config.md#cost-presets) that you selected for the cluster. If you collect all tables, the DCR will use the `Microsoft-ContainerInsights-Group-Default` stream, which is a group stream that includes all of the streams listed in [Stream values](container-insights-cost-config.md#stream-values). You must change this to individual streams if you're going to use a transformation. Any other cost preset settings will already use individual streams. @@ -55,7 +55,7 @@ The sample below shows the `Microsoft-ContainerInsights-Group-Default` stream. 
S ## Data flows -The [dataFlows section of the DCR](../essentials/data-collection-rule-structure.md#dataflows) matches streams with destinations that are defined in the `destinations` section of the DCR. Table names don't have to be specified for known streams if the data is being sent to the default table. The streams that don't require a transformation can be grouped together in a single entry that includes only the workspace destination. Each will be sent to its default table. +The [Data flows section of the DCR](../essentials/data-collection-rule-structure.md#data-flows) matches streams with destinations that are defined in the `destinations` section of the DCR. Table names don't have to be specified for known streams if the data is being sent to the default table. The streams that don't require a transformation can be grouped together in a single entry that includes only the workspace destination. Each will be sent to its default table. Create a separate entry for streams that require a transformation. This should include the workspace destination and the `transformKql` property. If you're sending data to an alternate table, then you need to include the `outputStream` property which specifies the name of the destination table. diff --git a/articles/azure-monitor/essentials/data-collection-endpoint-overview.md b/articles/azure-monitor/essentials/data-collection-endpoint-overview.md index f037a81936..b9056e2ded 100644 --- a/articles/azure-monitor/essentials/data-collection-endpoint-overview.md +++ b/articles/azure-monitor/essentials/data-collection-endpoint-overview.md @@ -15,7 +15,7 @@ ms.reviwer: nikeist A data collection endpoint (DCE) is a connection where data sources send collected data for processing and ingestion into Azure Monitor. This article provides an overview of data collection endpoints and explains how to create and set them up based on your deployment. ## When is a DCE required? -Prior to March 31, 2024, a DCE was required for all data collection scenarios using a DCR that required an endpoint. DCRs for supported scenarios created after this date include their own endpoints for logs and metrics. The URL for these endpoints can be found in the [`logsIngestion` and `metricsIngestion`](./data-collection-rule-structure.md#endpoints) properties of the DCR. These endpoints can be used instead of a DCE for any direct ingestion scenarios. +Prior to March 31, 2024, a DCE was required for all data collection scenarios using a DCR that required an endpoint. DCRs for supported scenarios created after this date include their own endpoints for logs and metrics. The URL for these endpoints can be found in the [`logsIngestion` and `metricsIngestion`](./data-collection-rule-structure.md#properties) properties of the DCR. These endpoints can be used instead of a DCE for any direct ingestion scenarios. Endpoints cannot be added to an existing DCR, but you can keep using any existing DCRs with existing DCEs. If you want to move to a DCR endpoint, then you must create a new DCR to replace the existing one. A DCR with endpoints can also use a DCE. In this case, you can choose whether to use the DCE or the DCR endpoints for each of the clients that use the DCR. 
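+
+For example, a DCR created with the `Direct` kind includes its own endpoints, similar to the following sketch. The DCR name and URLs here are illustrative placeholders; the actual values are generated automatically when the DCR is created.
+
+```json
+{
+    "kind": "Direct",
+    "properties": {
+        "endpoints": {
+            "logsIngestion": "https://mydcr-a1b2.eastus-1.ingest.monitor.azure.com",
+            "metricsIngestion": "https://mydcr-a1b2.eastus-1.metrics.ingest.monitor.azure.com"
+        }
+    }
+}
+```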
diff --git a/articles/azure-monitor/essentials/data-collection-rule-samples.md b/articles/azure-monitor/essentials/data-collection-rule-samples.md index c17b202bbb..df7ae1677e 100644 --- a/articles/azure-monitor/essentials/data-collection-rule-samples.md +++ b/articles/azure-monitor/essentials/data-collection-rule-samples.md @@ -287,7 +287,7 @@ The sample [data collection rule](../essentials/data-collection-rule-overview.md - Applies a [transformation](../essentials//data-collection-transformations.md) to the incoming data. > [!NOTE] -> Logs ingestion API requires the [logsIngestion](../essentials/data-collection-rule-structure.md#endpoints) property which includes the URL of the endpoint. This property is added to the DCR after it's created. +> Logs ingestion API requires the [logsIngestion](../essentials/data-collection-rule-structure.md#properties) property which includes the URL of the endpoint. This property is added to the DCR after it's created. ```json { diff --git a/articles/azure-monitor/essentials/data-collection-rule-structure.md b/articles/azure-monitor/essentials/data-collection-rule-structure.md index 55d2ad3cc6..fdcc203f4c 100644 --- a/articles/azure-monitor/essentials/data-collection-rule-structure.md +++ b/articles/azure-monitor/essentials/data-collection-rule-structure.md @@ -154,10 +154,6 @@ Data flows match input streams with destinations. Each data source may optionall - - - - ## Next steps [Overview of data collection rules and methods for creating them](data-collection-rule-overview.md) diff --git a/articles/azure-monitor/logs/create-custom-table.md b/articles/azure-monitor/logs/create-custom-table.md index 514c7a3fef..3df2aa31cb 100644 --- a/articles/azure-monitor/logs/create-custom-table.md +++ b/articles/azure-monitor/logs/create-custom-table.md @@ -53,7 +53,7 @@ To create a custom table, you need: Azure tables have predefined schemas. To store log data in a different schema, use data collection rules to define how to collect, transform, and send the data to a custom table in your Log Analytics workspace. To create a custom table with the Auxiliary plan, see [Set up a table with the Auxiliary plan (Preview)](create-custom-table-auxiliary.md). > [!IMPORTANT] -> Custom tables have a suffix of **_CL**; for example, *tablename_CL*. The Azure portal adds the **_CL** suffix to the table name automatically. When you create a custom table using a different method, you need to add the **_CL** suffix yourself. The *tablename_CL* in the [DataFlows Streams](../essentials/data-collection-rule-structure.md#dataflows) properties in your data collection rules must match the *tablename_CL* name in the Log Analytics workspace. +> Custom tables have a suffix of **_CL**; for example, *tablename_CL*. The Azure portal adds the **_CL** suffix to the table name automatically. When you create a custom table using a different method, you need to add the **_CL** suffix yourself. The *tablename_CL* in the [DataFlows Streams](../essentials/data-collection-rule-structure.md#data-flows) properties in your data collection rules must match the *tablename_CL* name in the Log Analytics workspace. > [!WARNING] > Table names are used for billing purposes so they should not contain sensitive information. 
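+
+For example, a data flow that sends data to a hypothetical custom table named *MyTable_CL* might look like the following sketch. The stream, destination, and table names are illustrative placeholders; a `transformKql` of `source` passes the incoming data through unchanged.
+
+```json
+{
+    "dataFlows": [
+        {
+            "streams": ["Custom-MyTable_CL"],
+            "destinations": ["myWorkspace"],
+            "transformKql": "source",
+            "outputStream": "Custom-MyTable_CL"
+        }
+    ]
+}
+```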
diff --git a/articles/azure-monitor/logs/logs-ingestion-api-overview.md b/articles/azure-monitor/logs/logs-ingestion-api-overview.md index f19d06a49b..7e2818b75c 100644 --- a/articles/azure-monitor/logs/logs-ingestion-api-overview.md +++ b/articles/azure-monitor/logs/logs-ingestion-api-overview.md @@ -60,7 +60,7 @@ The endpoint URI uses the following format, where the `Data Collection Endpoint` :::image type="content" source="media/logs-ingestion-api-overview/data-collection-rule-immutable-id.png" lightbox="media/logs-ingestion-api-overview/data-collection-rule-immutable-id.png" alt-text="Screenshot of a data collection rule showing the immutable ID."::: -`Stream Name` refers to the [stream](../essentials/data-collection-rule-structure.md#streamdeclarations) in the DCR that should handle the custom data. +`Stream Name` refers to the [stream](../essentials/data-collection-rule-structure.md#input-streams) in the DCR that should handle the custom data. ``` {Data Collection Endpoint URI}/dataCollectionRules/{DCR Immutable ID}/streams/{Stream Name}?api-version=2023-01-01 From 14771beffa1155520e5cd4b1fd23bb5a0aac7d29 Mon Sep 17 00:00:00 2001 From: Brian Wren Date: Wed, 16 Oct 2024 07:57:35 -0700 Subject: [PATCH 3/4] missing kinds --- .../essentials/data-collection-rule-structure.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/articles/azure-monitor/essentials/data-collection-rule-structure.md b/articles/azure-monitor/essentials/data-collection-rule-structure.md index fdcc203f4c..7cf8638c27 100644 --- a/articles/azure-monitor/essentials/data-collection-rule-structure.md +++ b/articles/azure-monitor/essentials/data-collection-rule-structure.md @@ -35,10 +35,10 @@ The following table lists the different kinds of DCRs and their details. | Kind | Description | |:---|:---| | `Direct` | Direct ingestion using Logs ingestion API. Endpoints are created for the DCR only if this kind value is used. | -| `AgentDirectToStore` | | -| `AgentSettings` | | +| `AgentDirectToStore` | Send collected data to Azure Storage and Event Hubs. | +| `AgentSettings` | Configure Azure Monitor agent parameters. | | `Linux` | Collect events and performance data from Linux machines. | -| `PlatformTelemetry` | | +| `PlatformTelemetry` | Export platform metrics. | | `Windows` | Collect events and performance data from Windows machines. | | `WorkspaceTransforms` | Workspace transformation DCR. This DCR doesn't include an input stream. | From d7c6e1cb549771c550fd8d8f4bff4fe95c53b795 Mon Sep 17 00:00:00 2001 From: Regan Downer Date: Wed, 16 Oct 2024 13:08:51 -0400 Subject: [PATCH 4/4] Typo --- .../azure-monitor/essentials/data-collection-rule-structure.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/articles/azure-monitor/essentials/data-collection-rule-structure.md b/articles/azure-monitor/essentials/data-collection-rule-structure.md index 7cf8638c27..eed539cd2a 100644 --- a/articles/azure-monitor/essentials/data-collection-rule-structure.md +++ b/articles/azure-monitor/essentials/data-collection-rule-structure.md @@ -61,7 +61,7 @@ The input stream section of a DCR defines the incoming data that's being collect | `dataSources` | Known type of data. This is often data processed by Azure Monitor agent and delivered to Azure Monitor using a known data type. | | `streamDeclarations` | Custom data that needs to be defined in the DCR. | -Data sent from the Logs ingestion API uses a `streamDeclaration` with the schema of the incoming data. 
This is because the the API sends custom data that can have any schema. +Data sent from the Logs ingestion API uses a `streamDeclaration` with the schema of the incoming data. This is because the API sends custom data that can have any schema. Text logs from AMA are an example of data collection that requires both `dataSources` and `streamDeclarations`. The data source includes the configuration