Category: Uncategorized

Sending Notifications to Teams from WhereScape RED

Jan 7, 2022 by Simon Meacher

In our video, we discuss how to send your scheduler notifications to Microsoft Teams using a webhook via PowerShell.


Here is an example of the code used to send a notification:

param ($SequenceNumber, $NotificationMessage)

# Values inserted into the Teams message card.
$IncidentStartTime = Get-Date -Format "dddd dd/MM/yyyy HH:mm"
$IncidentServiceName = "P & L Server"
$IncidentMessage = $NotificationMessage
$HTMLRowData = ""    # Optionally populate with an HTML table of job detail rows.

# Credentials are read from files so the script can run unattended under the scheduler.
[string]$username = Get-Content C:\Temp\user.txt
[SecureString]$securePwd = Get-Content C:\Temp\password.txt | ConvertTo-SecureString -Key (Get-Content C:\Temp\aes.key)
$psCred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $username, $securePwd

$adminURL = Get-Content C:\Temp\TenantURL.txt
$siteURL = Get-Content C:\Temp\TeamURL.txt

[string]$webhookURL = Get-Content C:\Temp\WebhookURL.txt

# Only needed if you go on to perform PnP operations against the site;
# posting to the webhook alone does not require this connection.
Connect-PnPOnline -Url $adminURL -Credentials $psCred

# MessageCard payload with UPPERCASE placeholder tokens, substituted below.
[string]$Notification = @"
{
    "@type": "MessageCard",
    "@context": "https://schema.org/extensions",
    "summary": "Data Warehouse Notification",
    "themeColor": "0072C6",
    "title": "Data Warehouse Notification",
    "sections": [{
        "facts": [
            { "name": "Job Sequence:", "value": "SEQUENCENUMBER" },
            { "name": "Date/Time:",    "value": "DATETIME" },
            { "name": "Job Name:",     "value": "SERVICENAME" },
            { "name": "Description:",  "value": "MESSAGE" },
            { "name": "HTML Table:",   "value": "HTMLTABLE" }
        ],
        "text": "Data Warehouse Notification Update"
    }]
}
"@

# Substitute the placeholder tokens with the runtime values.
[string]$NotificationBody = $Notification.Replace("SEQUENCENUMBER", "$SequenceNumber").Replace("DATETIME", "$IncidentStartTime").Replace("SERVICENAME", "$IncidentServiceName").Replace("MESSAGE", "$IncidentMessage").Replace("HTMLTABLE", "$HTMLRowData")

# Post the card to the Teams channel's incoming webhook.
Invoke-RestMethod -Uri $webhookURL -Method Post -Body $NotificationBody -ContentType 'application/json'
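The encrypted credential files that the script reads (user.txt, password.txt and aes.key) can be generated once, up front. Here is a minimal sketch using standard PowerShell cmdlets; the file paths match those in the script above, and the example user name is a placeholder to replace with your own:

```
# One-off setup: create the AES key and encrypted password files
# that the notification script reads at run time.

# Generate a random 256-bit AES key and save it (one byte per line,
# matching the Get-Content usage in the script above).
$key = New-Object byte[] 32
[System.Security.Cryptography.RandomNumberGenerator]::Create().GetBytes($key)
$key | Set-Content C:\Temp\aes.key

# Store the user name (a placeholder value, adjust for your tenant).
Set-Content C:\Temp\user.txt -Value "svc-scheduler@yourtenant.onmicrosoft.com"

# Prompt for the password and encrypt it with the AES key, so any
# account that holds the key file can decrypt it (not just the current user).
$securePwd = Read-Host "Password" -AsSecureString
$securePwd | ConvertFrom-SecureString -Key $key | Set-Content C:\Temp\password.txt
```

Using `-Key` rather than the default DPAPI encryption is what allows the scheduler's service account, rather than only the user who created the file, to decrypt the password. Protect the key file accordingly with filesystem permissions.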

We would love to hear how you have used or adapted the code for your own purposes!


Contact Us


Gold Standards – Part 3

Jan 4, 2022 by Tyler Bodys

Engaging Data Explains:

Creating The Gold Standards in Data –

Part III


In this third part of our blog on gold standards, we’re going to assess the importance of processes in creating an ideal data environment. If you recall from previous parts of this blog, we have juxtaposed the process of dealing with data with the running of a cake shop. And when you’re developing the ideal processes for either type of business, simplicity is critically important.

But ‘process’ also refers to several aspects of the business. It can mean the discussion of new orders with customers, the production process itself (the time it takes to bake and distribute cakes), or the management of customer expectations and orders.

The processes involved in the business help your team understand what they do and how they fit in, and play a vital role in organising team responsibilities and interactions on the way to gold standards. It means that even if team members aren’t specialists in a particular area, they can still refer to established processes and trust that these will result in a strong final product. For example, if a baker doesn’t know how to decorate a cake, they can still understand that it can be decorated, and potentially the time that this will take.


Understanding The Process

A good rule of thumb is to aim for the production team to understand enough about the process that they can make accurate estimates on production schedules. This demonstrates how much you can produce in one week if everyone works at 100% efficiency. And it also offers details of the effort needed to create the cake, and therefore the return or profit derived from the product. 

Once you have a good understanding of customer demand and production time, you can better manage cost and resourcing, potentially enabling you to offer alternatives to customers who have particular requirements. For example, customer demand may require the cake shop to produce 1,000 cakes in a week instead of just 100. By developing a process around the type of cake shop the establishment wishes to be in the local market, the orders can be successfully fulfilled.

In order to address this, the cake shop conducted some research on the market, reviewing the orders received over the last 12 months. Based on this data, they decided to optimise their operation so that they produced a consistent standard of high quality, mid-price range cakes using the best quality local produce. It is anticipated that this will constitute 80% of the company’s orders. Alongside this, some premium cakes will also be produced, with the intention of satisfying exclusive clientele, after the manager of the store noticed that this sort of cake tends to produce positive social media feedback.

With this in mind, the business decided to create a unified process, enabling its staff to handle both types of orders using the same equipment. Producing premium cakes required different ingredients and alterations to the production process, and consequently more effort and attention to detail. Furthermore, separate ingredient storage and labelling, to avoid premium ingredients being used in standard cakes, was also deemed essential.

Whether making cakes en masse to sell to local shops, or an artisan cake shop producing a range of bespoke, high quality and expensive cakes, each manufacturer will develop its own process to suit the desired output.

Questions you can ask yourself:

  • Do we understand what cake or range of cakes we are trying to produce?
  • Can you easily explain your process to a customer, in terms they understand?
  • Is the manufacturing process suitable for the product/product range?
  • Do you understand the running costs involved with making more cakes?

Tools and Delivery

Producing gold standard cakes is not as simple as merely building a process or hiring the best people. In some cases, you can be limited by your tools. Realistically, there may be budgetary constraints that prevent you from purchasing the appropriate equipment for the job. So it can be best to utilise existing tools, but highlight them within the process as a potential risk to production. This will help with any customer discussions, particularly understanding any impact on production costs.

The new process at the cake shop enabled existing equipment, including the all-important ovens, to be fully utilised in the production of the desired range of cakes. As a result, business is soon booming and there is a large backlog of orders.

After assessing the production process and resources, the manager of the bakery decides that the most cost-effective way of increasing production is simply to add another oven within the production line. The additional oven enables the same staff and process to double the number of cakes that can be produced, while retaining the same processing time. The initial purchase price and ongoing maintenance costs are easily met by the increased revenue.

Questions you can ask yourself:

  • Do my tools enable me to produce our products effectively and efficiently?
  • What limitations do we have?
  • Do we utilise our tools correctly?
  • Does our production process utilise our tools in the best way possible?
  • Are there any other tools that can help me, either by saving resources or production time?

Time

Production time is often hidden from customers. All of the fantastic-looking cakes in the shop window, or publicised on social media, take time to produce. Yet customers rarely appreciate the intricacy of the production process. Nor are they usually willing to wait! Therefore, manage expectations around deadlines from the very minute that any customer orders a cake; it’s always better to talk about this issue first.

Sometimes the thinking or desires of customers can change at the eleventh hour. On one occasion, the cake shop received a call from a customer late on Friday evening. At the last minute, Mr Jones had remembered that he needed to order a cake for the next day’s big party. As one of the company’s bakers completed the order form, he realised that the order might not be possible, given the existing orders already due to be baked. After a quick check with the kitchen staff, he confirmed that the order could not be satisfied, and informed Mr Jones. The customer was disappointed, but understood once the kitchen capacity and pre-existing orders had been explained to him.

Being transparent with customers and influencers about existing orders and production processes will help any business, but no process will ever stop unexpected orders from appearing at 8pm on a Friday night. Transparency does, however, create a consistent customer experience and help companies to mitigate any associated disappointment. Communication can be critical when all other approaches fall short.


Creating gold standards in data, or any business, is always dependent on efficient processes being put in place. These processes encompass every aspect of dealing with internal and external factors, and consequently it is critical for companies to continually refine every aspect of their operation.


Contact Us


The Gold Standard – Part I

Dec 7, 2021 by Tyler Bodys

Engaging Data Explains:

Creating The Gold Standards in Data –

Part I


Achieving an excellent level of data architecture is far from easy. But it is certainly possible if you implement certain guiding principles. Central to this is the implementation of a gold standard benchmark, which can then underpin any effective data architecture operation.

However, a gold standard is not something that comes naturally. In our experience, it requires diligent thought, effort and openness to change.

So in this four-part blog, we’re going to discuss some of the considerations related to this important goal for many organisations.


Resource Considerations

When we think about creating engaging analytics or data platforms to shape the growth of an organisation, the focus is often on finding a tool capable of developing the solution, and not on the surrounding aspects. But all of the ingredients that go into the mix are critically important.

Imagine you’re the owner of a cake shop providing bespoke cakes for your customers. Your products have to be good enough to keep customers coming back, but they also have to retail at an attractive price point. This means that there are immediately resource considerations.

You may choose to focus on providing a premium product, creating high-quality goods for a premium price.  Alternatively, you may deliver a higher volume product, baking lots of different cakes on a larger scale, which are still of a good standard, but only suitable for a lower price point.

In order to make this decision, you need to understand the following:

  • Product – what we are providing, and the value that we create.
  • Place – the environment that makes it possible to create the necessary standard of products at a sustainable rate.
  • People and Process – the team, the processes and the delivery environment (the bakery, the storage, the front of house, stock control, delivery, billing, etc.) that produce and maintain the consistent quality of product and experience.

If you can put all this together then you have the beginning of a gold standard in cake production. By the same token, in our field Engaging Data helps companies to review all of the data elements supporting such a setup, combining this with their aspirations to form bespoke gold standards. This enables our clients to achieve profitability and success.

Controlling The Input

Controlling input is a critical component of producing gold standard products. The relevant inputs vary, depending on what you are trying to produce, but examples include:

  • Requirement gathering.
  • Sources of data.
  • Quality of data.

Controlling inputs and creating quality is critically important, as if you put terrible inputs into a system then the ultimate outcome will be a terrible product! Thus, you need to understand the requirements of your customers. In the cake shop example, this would mean knowing what type of cakes your customers desire, the toppings needed, the date and time of delivery, any dietary requirements, and so on.

The good news is that controls can be quite straightforward. They can be something as simple as checking data. So in a cake shop, it’s vital to confirm the direct requirements of your customers, noting down all relevant information. This can make the vital difference between providing the ideal products for your customers, or producing something that seems excellent, but is rendered useless or sub-par by one important constituent. For example, you might produce a cake for someone with allergy needs that is simply inedible from their perspective.

Quality control can be achieved by creating a simple order form. For example, a cake shop might include:

  • All vital information being distilled into yes/no questions – e.g. “should the cake contain nuts?”.
  • Ensuring that all product types are selected, and that nothing out of the ordinary is ordered.
  • Product limitations being noted expressly on the form – acting as a reminder and preventing incorrect ordering.

Such a review process ensures that information is gathered correctly, and creates a collective responsibility for discerning the appropriate information. Important questions that you can ask yourself in a data environment to acquire such critical information include the following:

  • How will the requirements come into the team? 
  • How do we need to record them?
  • Do we have the right tools to collect the data?
  • How will we handle data quality?

Output Consistency

The output is the result of your efforts, so you have an innate interest in ensuring that it’s the best possible product. In common with the input, it is important to understand what you can control to reduce risk, as this can have a big impact on your output.

Central to this process is building systems and controls that enable you to monitor outputs. This in turn makes it possible to assess if they need to be altered in any way. This means that in a cake shop, you may consider the impact that each of the following areas has on the supply chain of cakes:

  • Production Team (bakers, shop front, etc.).
  • Ensuring similar standards and experience.
  • Providing the same customer experience.
  • Ensuring knowledge of the production processes, industry and competitors.

Each aspect of the order and production process also needs to be assessed and standardised:

  • Enjoyable and consistent ordering experience.
  • Stock control to manage high-quality ingredients. 
  • Quality control of all products.
  • Meeting all food hygiene regulations with a 5-star rating.

And then the tools of the trade should also be taken into consideration, as part of an ongoing auditing process. Central to this is ensuring that any equipment being used is within acceptable operational parameters, particularly not being overloaded or overstretched in any way.

So when you’re working in a data environment, or any working context, if you want to create gold standards then it’s important to continually monitor and challenge your processes. Ask yourself questions continually, such as:

  • Do we have the right team in place?
  • Do we understand what standard of products we need to create?
  • Do we have processes in place that enable us to produce quality products?

This is just the beginning of our insight into creating gold standards with data, so in our next blog we will move on to discuss several other important factors.


Contact Us


Self-Service BI

Oct 12, 2021 by Tyler Bodys

Engaging Data Explains:

Self-Service BI


Self-service business intelligence is making a huge difference to companies across a variety of sectors, by helping to optimise data analysis.

However, some businesses perceive that implementing a business intelligence approach is challenging, due to perceived barriers to entry. This is something of a false impression, which is why it’s important for business intelligence to let users explore data with simple tools, while enabling questions to be answered quickly.


Dealing with Data Management

Many traditional data teams build pipelines of data in order to deal with their data management procedures. But the approach utilised is not always ideal. It’s common for data engineers and analysts to explore data and build solutions with specific results in mind, which is rather putting the cart before the horse. More logical would be to make the data generally available and appropriately linked. This enables end-users to more readily explore the data, drawing more nuanced conclusions from the assembled information.

Truly valuable business intelligence should always be an end-to-end iterative process. Business teams need relevant, timely data in order to uncover accurate insights. Deploying platforms to achieve this is merely one crucial component of business intelligence. 

As you develop your business intelligence approach, it’s important to understand that self-service business intelligence doesn’t require everyone in an organisation to train as business analysts. Nor should it mean removing responsibility from your IT department. Instead, it’s about encouraging and educating your business teams to understand and interact with the data they generate throughout their work – creating a joined-up process across your organisation.

Using Power BI

One of the best ways to achieve this is via Power BI. This cloud package has been on a long journey since it was first released, but is now the ideal app to ensure that companies can engage with data safely. Probably the most valuable aspect of this package is that it provides an incredible amount of options and guided analysis, making your business intelligence process much more flexible.

Did you know you can achieve 90% of this using Power BI Report Server, the on-premises version of Power BI?

Using Power BI Report Server prompted an excellent reaction from end-users, who were excited at their ability to slice data and retrieve answers to client queries rapidly. This is one of the big advantages of business intelligence; it enables companies to get to the core of what drives customer demand quickly. You are accelerating the ability of clients to use data to solve critical business problems.

Dealing with self-service business intelligence can be intimidating, though. This is particularly true if an organisation requires this to be achieved without using the cloud, or spending a significant amount of money on new software. Historically, this form of business intelligence has posed problems with identifying required data, such as reports with little or no description or business knowledge behind their purpose. For this reason, self-service business intelligence has been known to reduce IT departments to cold sweats!

In early experiences of self-service business intelligence, there was often no gold standard with design. Imagine – using SSRS where there could be hundreds of folders and sub-folders, with dozens of data connections, generating thousands of reports. And then every report featured a different format. Understanding how each report was being used was nigh-on impossible, and general data analysis was far from effective.

Range of Options

But things are changing. New self-service business intelligence solutions offer clients a range of options, and enable data to be provided safely and in a controlled format. And there are several approaches available for companies, with decentralised, centralised, and hybrid models all possible. The choice will depend on the make-up of IT teams, with the level of governance and control required having a major influence.

The great thing about Power BI desktop is that it is an adaptable tool, enabling users to get started with data transformation quickly. Most experienced Excel users can quickly get to grips with this innovation, meaning that it can be easily and widely implemented across an organisation.

The tool itself allows IT to extract the data transformation and, if required, reverse engineer it back into a data warehouse, providing business users with a tool that acts like a Rosetta stone between business and IT.

It’s important to emphasise, though, that regardless of the technology you always need to understand who you are catering for with data. With this in mind, there are several ways that you can delegate the permissions, rules, and responsibilities with Power BI, as this adds to the flexibility of the platform. Power users are also important, as these credentialed individuals utilise their experience with Excel to produce reports for an organisation.

As you develop your self-service business intelligence strategy, it is vital to implement proper governance. This will help you to avoid creating data silos, data sprawl, poor performance, and lax security. Unless you implement appropriate governance, business users will have unrestricted access to source systems and Power BI folders. This can lead to inappropriate sharing of data.

Key Factors

It’s also important to consider the following factors:

  • Service size and data storage – you do not have unlimited resources with Power BI Report Server, and you could therefore experience larger datasets consuming a significant portion of resources, impacting the performance of the entire service.
  • Risks to source systems – allowing users to connect with any source system or raw files can create problems.
  • Access and permissions – security must always be taken into consideration. Failing to pay proper attention to access and permissions can result in numerous ad-hoc groups being created, which can then be problematic. Controlling groups, and tweaking which users can be added into each AD group, is definitely advisable.
  • Many versions of the truth – if you have several different people involved in creating the same report, you’re likely to get numerous different answers. This is why it is important for data warehouses to be populated, effectively creating a single source of truth.
  • Reduced audit and tracking – if end-users fail to provide adequate details or purposeful dashboards, the ‘who, what & why’ regarding the purpose of the report is lost, undermining the whole process.

Summary

Implementing Self-Service Business Intelligence has become far more feasible, but it’s still important to impose some control over how your services are being utilised, in order to generate the maximum and most accurate insight. Power BI Report Server can be an excellent tool in enhancing business intelligence, and definitely one we recommend for clients who are reliant on data.


If you are interested in Self-Service BI, book a call with us to find out how it could help your team.


Contact Us
