Yes. We are planning to offer a secure service that lets you publish your data to the cloud and access your dashboards online. This will include a web-based portal and an app with push-notification support.
Your database connection string is stored in your appsettings.json file:

{
  "ConnectionStrings": {
    "DefaultConnection": "Server=localhost;Database=DataBus;Trusted_Connection=True;MultipleActiveResultSets=true"
  },
  "DataBusConfig": {
    "ConfigManager": "EntityFrameworkConfig"
  }
}
You can also change your configuration type to EntityFrameworkConfig, FileConfig, or MongoDBConfig.
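For example, to store configuration in a file instead of the database, the DataBusConfig section might be changed as sketched below. This is an assumption for illustration; whether FileConfig requires additional keys (such as a file path) is not covered here.

```json
"DataBusConfig": {
  "ConfigManager": "FileConfig"
}
```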
Logging is configured in the same appsettings.json file:

"Logging": {
  "PathFormat": "Logs/indicium.databus-{Date}.log",
  "IncludeScopes": false,
  "Debug": {
    "LogLevel": {
      "Default": "Information"
    }
  },
  "Console": {
    "LogLevel": {
      "Default": "Information"
    }
  },
  "LogLevel": {
    "Default": "Information",
    "System": "Information",
    "Microsoft": "Warning",
    "Microsoft.EntityFrameworkCore.Database.Command": "Warning",
    "Microsoft.EntityFrameworkCore.Infrastructure": "Warning",
    "Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker": "Warning",
    "Microsoft.AspNetCore.Mvc.Internal.ObjectResultExecutor": "Warning",
    "Microsoft.AspNetCore.Hosting.Internal.WebHost": "Warning",
    "Microsoft.AspNetCore.Authentication.JwtBearer.JwtBearerHandler": "Warning",
    "Microsoft.AspNetCore.Authorization.DefaultAuthorizationService": "Warning"
  }
}
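If you need to see the SQL that Entity Framework generates, one option (a sketch, assuming the standard .NET logging category names shown above) is to raise that category's level from Warning to Information:

```json
"LogLevel": {
  "Microsoft.EntityFrameworkCore.Database.Command": "Information"
}
```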
Using the Web Feed plugin, you can obtain data from a service that publishes its data at a URI. Use the 'New Data Received Event' to process that data, and the JSON Extract plugin to extract the individual items from the JSON in the pipeline.
This example processes weather data from the Australian Government Bureau of Meteorology: http://www.bom.gov.au/fwo/IDT60801/IDT60801.94975.json
import clr
from System import DateTime, String, Math, Convert, Globalization
from Indicium.DataBus.Common.Data import *

clr.AddReference('Newtonsoft.Json')
from Newtonsoft.Json import *

class Automation:
    def __init__(self):
        self.heartbeat = 0

    def newData(self, data):
        self.heartbeat += 1
        seriesEvent = SeriesEvent()
        seriesEvent.Uri = data.Uri
        seriesEvent.Name = data.Name
        wJson = data.Point.ParseJson()
        dataArr = wJson.observations.data
        for item in dataArr:
            weather = Linq.JObject()
            weather.temp = item.apparent_t
            weather.cloud = item.cloud
            try:
                # aifstime_utc is a 24-hour timestamp (e.g. 20180901143000),
                # so use 'HH' rather than the 12-hour 'hh' specifier
                dateT = DateTime.ParseExact(item.aifstime_utc.ToString(), 'yyyyMMddHHmmss', Globalization.CultureInfo.CurrentCulture)
                tVal = TimeValue.Create(dateT, weather)
                seriesEvent.Values.Add(tVal)
            except Exception as ex:
                # Keep the raw timestamp so failed rows can be diagnosed later
                weather.dt = item.aifstime_utc.ToString()
        return seriesEvent
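Outside the DataBus runtime, the timestamp parsing above can be sketched in plain Python. The format string is an assumption based on the feed's aifstime_utc field (e.g. 20180901143000); %H is the 24-hour field, matching the .NET 'HH' specifier.

```python
from datetime import datetime

def parse_aifstime(ts):
    """Parse a BoM-style UTC timestamp such as '20180901143000'."""
    return datetime.strptime(ts, "%Y%m%d%H%M%S")

print(parse_aifstime("20180901143000"))  # → 2018-09-01 14:30:00
```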
Here is another example where the URL is a CSV file. Use this in conjunction with the CsvExtractPlugin to get values from the individual columns in your pipeline.
import csv
from StringIO import StringIO
from System import DateTime, String, Math, Convert, Globalization, DateTimeOffset, TimeZoneInfo
from Indicium.DataBus.Common.Data import *

class Automation:
    def newData(self, data):
        seriesEvent = SeriesEvent()
        seriesEvent.Uri = data.Uri
        seriesEvent.Name = data.Name
        stringCsv = StringIO(data.Point.Value)
        sReader = csv.reader(stringCsv, delimiter=',')
        counter = 0
        for row in sReader:
            counter += 1
            if counter > 1:  # Skip header row
                try:
                    # Column 1 is expected to hold the timestamp
                    dt = DateTimeOffset.Parse(row[1])
                    tVal = TimeValue.Create(dt, ",".join(row))
                    seriesEvent.Values.Add(tVal)
                except Exception as ex:
                    # Ignore rows whose timestamp cannot be parsed
                    exc = ex
        return seriesEvent
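The same skip-header-and-parse pattern can be sketched in plain Python 3 for testing outside the DataBus runtime. The sample data and the column layout (timestamp in column 1) are assumptions for illustration.

```python
import csv
from io import StringIO
from datetime import datetime

def parse_rows(raw):
    """Yield (timestamp, raw_row) pairs, skipping the header and unparseable rows."""
    reader = csv.reader(StringIO(raw))
    for i, row in enumerate(reader):
        if i == 0:
            continue  # skip header row
        try:
            # Column 1 is assumed to hold an ISO-8601 timestamp
            yield datetime.fromisoformat(row[1]), ",".join(row)
        except (ValueError, IndexError):
            pass  # ignore rows that fail to parse

sample = "id,time,value\n1,2020-05-01T10:00:00,42\n2,not-a-date,7\n"
rows = list(parse_rows(sample))
print(len(rows))  # → 1
```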