Gold Standards – Part 3

Engaging Data Explains:

Creating The Gold Standards in Data – Part III


In this third part of our blog on gold standards, we’re going to assess the importance of processes in creating an ideal data environment. If you recall from previous parts of this blog, we have compared the process of dealing with data to the running of a cake shop. And when you’re developing the ideal processes for either type of business, simplicity is critically important.

But ‘process’ also covers several aspects of the business. It can refer to the discussion of new orders with customers, to production itself – the time it takes to bake and distribute cakes – or to the management of customer expectations and orders.

The processes involved in the business help your team understand what they do and how they fit in, and they play a vital role in organising team responsibilities and interactions on the way to gold standards. Even if team members aren’t specialists in a particular area, they can refer to established processes and trust that following them will result in a strong final product. For example, if a baker doesn’t know how to decorate a cake, they can still understand that it can be decorated, and roughly how long this will take.


Understanding The Process

A good rule of thumb is for the production team to understand enough about the process to make accurate estimates of production schedules. Such estimates show how much you can produce in one week if everyone works at 100% efficiency, and they also reveal the effort needed to create each cake, and therefore the return or profit derived from the product.

Once you have a good understanding of customer demand and production time, you can better manage cost and resourcing, potentially enabling you to offer alternatives to customers who have particular requirements. For example, customer demand may require the cake shop to produce 1,000 cakes in a week instead of just 100. By developing a process around the type of cake shop it wishes to be in the local market, the business can fulfil those orders successfully.

In order to address this, the cake shop conducted some research on the market, reviewing the orders received over the last 12 months. Based on this data, they decided to optimise their operation to produce a consistent standard of high-quality, mid-price-range cakes using the best local produce, anticipating that these would constitute 80% of the company’s orders. Alongside this, some premium cakes would also be produced to satisfy exclusive clientele, after the manager of the store noticed that this sort of cake tends to generate positive social media feedback.

With this in mind, the owner decided to create a unified process, enabling her staff to handle both types of orders using the same equipment. Different ingredients and alterations to the production process were required for the bakers to produce premium cakes, so more effort and attention to detail were needed. Furthermore, separate ingredient storage and labelling – to avoid premium ingredients being used within the standard cakes – was also deemed essential.

Whether making cakes en masse to sell to local shops, or an artisan cake shop producing a range of bespoke, high quality and expensive cakes, each manufacturer will develop its own process to suit the desired output.

Questions you can ask yourself:

  • Do we understand what cake or range of cakes we are trying to produce?
  • Can you easily explain your process to a customer, in terms they understand?
  • Is the manufacturing process suitable for the product/product range?
  • Do you understand the running costs involved with making more cakes?

Tools and Delivery

Producing gold standard cakes is not as simple as merely building a process or hiring the best people. In some cases, you can be limited by your tools. Realistically, there may be budgetary constraints that prevent you from purchasing the appropriate equipment for the job. So it can be best to utilise existing tools, but highlight them within the process as a potential risk to production. This will help with any customer discussions, particularly in understanding any impact on production costs.

The new process at the cake shop enabled existing equipment, including the all-important ovens, to be fully utilised in the production of the desired range of cakes. As a result, business was soon booming and a large backlog of orders built up.

After assessing the production process and resources, the manager of the bakery decided that the most cost-effective way of increasing production was simply to add another oven to the production line. The additional oven enabled the same staff and process to double the number of cakes produced, while retaining the same processing time. The initial purchase price and ongoing maintenance costs were easily met by the increased revenue.

Questions you can ask yourself:

  • Do our tools enable us to produce our products effectively and efficiently?
  • What limitations do we have?
  • Do we utilise our tools correctly?
  • Does our production process utilise our tools in the best way possible?
  • Are there any other tools that could help us, either by saving resources or production time?

Time

Production time is often hidden from customers. All of your fantastic-looking cakes in the shop window, or publicised on social media, take time to produce. Yet customers rarely appreciate the intricacy of the production process. Nor are they usually willing to wait! Managing expectations around deadlines from the very minute that any customer orders a cake will head off most concerns, which is why it’s always better to talk about this issue first.

Sometimes the thinking or desires of customers can change at the eleventh hour. On one occasion, the cake shop received a call from a customer late on Friday evening. At the last minute, Mr Jones had remembered that he needed to order a cake so that it was ready for tomorrow’s big party. As one of the company’s bakers completed the order form, he realised that this order might not be possible, given the existing orders already due to be baked the next day. After a quick check with the kitchen staff, he confirmed that the order could not be satisfied, and informed Mr Jones. The customer was disappointed, but understood once the kitchen capacity and pre-existing orders were explained to him.

Being transparent with customers and influencers about existing orders and production processes will help any business, but no process will ever stop unexpected orders from appearing at 8pm on a Friday night. It does, however, create a consistent customer experience and helps companies to mitigate any associated disappointment. Communication can be critical when all other approaches fall short.


Creating gold standards in data, or any business, is always dependent on efficient processes being put in place. These processes encompass every aspect of dealing with internal and external factors, and consequently it is critical for companies to continually refine every aspect of their operation.



The Gold Standard – Part 2

Engaging Data Explains:

Creating The Gold Standards in Data – Part II: Assessing the Gold


In this second part of our four-part series on gold standards in data, we’re going to examine the importance of instilling a review process.

Reviewing is never the most exciting or relished of procedures! It often feels like an unnecessary grind that slows down your whole operation. But it’s actually a critical part of any good business, enabling you to identify major faults before they become ingrained.


Reviewing Processes

We’ve already seen in the first part of our gold standards blog that the cake shop reviewed their existing processes before making a decision to change their order form. Often it’s only from self-assessing in this way that you can uncover new ways of working that enhance your productivity and efficiency.

It’s also important to emphasise that this needs to happen across the business. It’s normal to have a review as part of project governance for new projects, but it’s always worthwhile to reassess existing projects as well. They may be ‘good enough’, but they also might not be reaching the gold standard.

So in our cake shop scenario, how will the business assess the standard of the cakes being produced? Well, when the business reviewed the kitchen, they found that each cook baked one order, while also being responsible for checking their own cakes before they moved to the decoration stage. This all seemed fine, but then the shop received a few complaints about burnt edges and sub-par ingredients. Consequently, the cake shop reflected that its existing process of ‘marking your own homework’ was not sufficiently robust to identify problems.

In order to address this issue, several ways of reviewing the production of cakes were decided upon:

  • A simple visual inspection – does the cake look uncooked or overcooked?
  • A thorough test, such as breaking the centre of the cake to see if it is cooked. 
  • Pressing the centre of the cake to see if it springs back.
  • Hiring Paul Hollywood as a tester!

All of these checks are designed to see if the cake has been cooked satisfactorily. The shop decided to adopt all of them, with the exception of hiring Paul Hollywood! Instead, the bakers will review one another’s baking, with the hope that the business will grow to support a head baker who will review all cakes.

Just as the cake shop reviewed its processes to put a more stringent review structure in place, the same can also be implemented in a data-driven environment. The following questions are examples of some that you can ask yourself as part of this process:

  • Does the team need training?
  • Do you need to recruit new people with different skill sets?
  • Do the products need to change?
  • Is the supply line quick enough? 
  • Would more people or different processes help with efficiency?
  • Is the product still worth the effort that is invested in producing it?

Producing Gold

Once you’ve baked some quality cakes, you then need to take steps to market the product. It’s not enough to just produce the cakes and leave your customers to eat them if you want to maximise your marketing efforts. Building advocacy and influence via your customers is a great way of marketing your cakes to the right people. Word-of-mouth feedback from advocates is trusted by other potential customers, and is therefore far more effective than other forms of marketing.

However, there are two sides to the coin here. Negative feedback can be dangerous if it’s not managed effectively. Negative comments about the burnt edges of cakes will spread like wildfire to existing and prospective customers. But there are ways of recovering from this. Making courtesy calls to customers can provide you with valuable insight into the process of ordering and consumption.

Getting the right team in place, tailoring products for your target market, and taking feedback onboard in an active process are all important facets of contemporary marketing. 

What are the Considerations?

Gold standards always begin with what you are looking to deliver and who this will benefit. Gold standards should be designed to support outcomes, having considered both the internal and external factors that will influence design. Creating steps in the process to continually challenge the functionality of the end product and ensure that standards are still relevant to the end user should therefore be considered essential.

Sponsors and influencers can also play an extremely important role. Both can become prime advocates of your product, with the added benefit that sponsors pay you to advertise your goods or services!

Internal Considerations

Data

Data can be compared to ingredients within a cake. Naturally, good quality ingredients are critical to producing the best cake possible. The same applies to data. As we mentioned in part one, if you put rubbish data into your systems, you can expect rubbish outcomes!

At some point, the cake shop realised that the bakers were not periodically reviewing the ingredients within their cupboards. To address this, the manager inspected everything they had on hand, ensuring that any poor-quality ingredients were replaced, and that anything out of date was thrown away. Labelling was updated, while processes were put in place to ensure that there was no repetition of these mistakes.

The key point here is that while the complaints registered were about burnt edges, it may have been the ingredients themselves that were the problem. Going forward, the team at the bakery put in place a series of key questions that would inform their processes in future baking:

  • Do we have enough ingredients to make the size of cake required?
  • Are we getting our ingredients from the right suppliers?
  • Does this product contain nuts?
  • Have I mixed sugar up with salt?
  • The milk smells as if it’s on the turn – should I use it?
  • Do we store the ingredients in the right place, in the correct containers?

Resources and Teams

When baking your cake, you can select from many different types of bakers or specialist chefs to assist with the process. Or you may decide that you wish to train yourself, or an existing employee, so that they can handle the most challenging baking tasks.

In some cases, if the cake shop utilises industrial equipment, people who have been trained to use this equipment can be deployed, as opposed to bakers or specialist cake makers. Having the right team with the right skills, and/or the aptitude to learn them, can be critical to successfully achieving gold standards. Instilling this in your team culturally is critically important in providing direction to your whole operation.

Achieving this can be as simple as asking yourself the following questions:

  • Is the team right to build the end product? If not, what needs to change?
  • Is the team open to changing or evolving in order to improve the product or efficiency?
  • Do we have the right skills? If not, do we need to second or buy them in?

Company Culture

Finally, failing to understand the company culture will lead to failure. Gold standards must fit the existing culture, or the direction in which the company is moving.

Understanding how your customers think, managing their expectations and developing a standard to consistently perform to those expectations is simple to conceive. However, the human element of this could result in you developing hundreds of different gold standards for multiple different customers. 

Important questions to ask yourself here:

  • Are the customers knowledgeable about your products? If not, can you educate them?
  • Do you share your practice? Would it help your customers to know what you do and how you do it?
  • Are there any expectations that you can manage? 
  • Are there any difficult expectations you have to work towards?

Implementing a gold standard for data may seem like an all-encompassing and intimidating goal. But it should instead be seen as a granular process. Breaking down the ingredients and individual components that collectively create gold standards is the best way to achieve this aim.



The Gold Standard – Part I

Engaging Data Explains:

Creating The Gold Standards in Data – Part I


Achieving an excellent level of data architecture is far from easy. But it is certainly possible if you implement certain guiding principles. Central to this is the implementation of a gold standard benchmark, which can then underpin any effective data architecture operation.

However, a gold standard is not something that comes naturally. In our experience, it requires diligent thought, effort and openness to change.

So in this four-part blog, we’re going to discuss some of the considerations related to this important goal for many organisations.


Resource Considerations

When we think about creating engaging analytics or data platforms to shape the growth of an organisation, the focus is often on finding the tool capable of developing the solution, and not on the surrounding aspects. But all of the ingredients that go into the mix are critically important.

Imagine you’re the owner of a cake shop providing bespoke cakes for your customers. Your products have to be good enough to keep customers coming back, but they also have to retail at an attractive price point. This means that there are immediate resource considerations.

You may choose to focus on providing a premium product, creating high-quality goods for a premium price.  Alternatively, you may deliver a higher volume product, baking lots of different cakes on a larger scale, which are still of a good standard, but only suitable for a lower price point.

In order to make this decision, you need to understand the following:

  • Product – what we are providing, and the value that we create.
  • Place – the environment that makes it possible to create the necessary standard of products at a sustainable rate.
  • People and Process – the team, the processes and the delivery environment (the bakery, the storage, the front of house, stock control, delivery, billing, etc.) that produce and maintain the consistent quality of product and experience.

If you can put all this together then you have the beginning of a gold standard in cake production. By the same token, in our field Engaging Data helps companies to review all of the data elements supporting such a setup, combining this with their aspirations to form bespoke gold standards. This enables our clients to achieve profitability and success.

Controlling The Input

Controlling input is a critical component of processing gold standard products. These vary, depending on what you are trying to produce, but examples include:

  • Requirement gathering.
  • Sources of data.
  • Quality of data.

Controlling inputs and creating quality is critically important, as if you put terrible inputs into a system then the ultimate outcome will be a terrible product! Thus, you need to understand the requirements of your customers. In the cake shop example, this would mean knowing what type of cakes your customers desire, the toppings needed, the date and time of delivery, any dietary requirements, and so on.

The good news is that controls can be quite straightforward. They can be something as simple as checking data. So in a cake shop, it’s vital to confirm the direct requirements of your customers, noting down all relevant information. This can make the vital difference between providing the ideal products for your customers, and producing something that seems excellent but is rendered useless or sub-par by one important constituent. For example, you might produce a cake for someone with allergy needs that is simply inedible from their perspective.

Quality control can be achieved by creating a simple order form, as the sketch after this list illustrates. For example, a cake shop’s form might include:

  • All vital information being distilled into yes/no questions – e.g. “should the cake contain nuts?”.
  • Ensuring that all product types are selected, and that nothing out of the ordinary is ordered.
  • Product limitations being noted expressly on the form – acting as a reminder and preventing incorrect ordering.
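
To ground the analogy in a data context, here is a minimal sketch of what such an order-form control might look like in code. Everything in it – the field names, types and limits – is a hypothetical illustration of distilling requirements into explicit checks, not a prescription.

```python
# A minimal, hypothetical "order form" control: every vital requirement
# is captured as an explicit field, and the form refuses to pass until
# each one is answered and within the product's stated limits.

REQUIRED_FIELDS = {
    "cake_type": str,       # what we are providing
    "contains_nuts": bool,  # yes/no allergy question
    "delivery_date": str,   # e.g. "2024-06-01"
    "servings": int,
}

PRODUCT_LIMITS = {"servings": (1, 200)}  # noted expressly on the form


def validate_order(order: dict) -> list[str]:
    """Return a list of problems; an empty list means the order is safe to accept."""
    problems = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in order:
            problems.append(f"missing answer: {field}")
        elif not isinstance(order[field], expected_type):
            problems.append(f"{field} should be {expected_type.__name__}")
    low, high = PRODUCT_LIMITS["servings"]
    if isinstance(order.get("servings"), int) and not low <= order["servings"] <= high:
        problems.append(f"servings outside what we can produce ({low}-{high})")
    return problems


print(validate_order({"cake_type": "sponge", "contains_nuts": False, "servings": 500}))
# ['missing answer: delivery_date', 'servings outside what we can produce (1-200)']
```

The point is not these specific rules, but that every vital question is asked explicitly before an order is accepted.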

Such a review process ensures that information is gathered correctly, and creates a collective responsibility for discerning the appropriate information. Important questions that you can ask yourself in a data environment to acquire such critical information include the following:

  • How will the requirements come into the team? 
  • How do we need to record them?
  • Do we have the right tools to collect the data?
  • How will we handle data quality?

Output Consistency

The output is the result of your efforts, so you have an innate interest in ensuring that it’s the best possible product. In common with the input, it is important to understand what you can control to reduce risk, as this can have a big impact on your output.

Central to this process is building systems and controls that enable you to monitor outputs. This in turn makes it possible to assess if they need to be altered in any way. This means that in a cake shop, you may consider the impact that each of the following areas has on the supply chain of cakes:

  • Production Team (bakers, shop front, etc.):
      • Ensuring similar standards and experience.
      • Providing the same customer experience.
      • Ensuring knowledgeability about the production processes, industry and competitors.

Each aspect of the order and production process also needs to be assessed and standardised:

  • Enjoyable and consistent ordering experience.
  • Stock control to manage high-quality ingredients. 
  • Quality control of all products.
  • Meeting all food hygiene regulations with a 5-star rating.

And then the tools of the trade should also be taken into consideration, as part of an ongoing auditing process. Central to this is ensuring that any equipment being used is within acceptable operational parameters, particularly not being overloaded or overstretched in any way.

So when you’re working in a data environment, or any working context, if you want to create gold standards then it’s important to continually monitor and challenge your processes. Keep asking yourself questions such as:

  • Do we have the right team in place?
  • Do we understand what standard of products we need to create?
  • Do we have processes in place that enable us to produce quality products?

This is just the beginning of our insight into creating gold standards with data, so in our next blog we will move on to discuss several other important factors.



Peer Review

Engaging Data Explains:

Peer Review


One of the most useful aspects of Power BI is the ability to achieve Self Service business intelligence. This is achieved with the Power BI Reporting Server. So in this blog post, we’re going to discuss how this is set up, along with some of the important things that you need to know about this system.

There are two important facets of this process that are critical to understand. Firstly, you can achieve everything that we will discuss in this blog with the free version of Power BI. And secondly, this is the ideal way to facilitate a move to the cloud – a hugely valuable process for virtually all companies.


Keep Data Masking Simple

We often encounter clients who have data masking requirements, and these can vary quite considerably. Recently, one such client had a particularly interesting business need; they needed to mask their production data, but it was also vital for them to retain human readability, so that their testing team could utilise reporting and their internal systems.

This was proving challenging for the client, as their core system did not include the ability to mask the data, and so they had no internal solution. On top of this central issue, the system that the company had built was extremely complex, having been built upon over 20 years in their industry.

Another issue was that there wasn’t a huge amount of documentation included with the system to help either the client or ourselves understand it, and how their data was stored. Furthermore, they didn’t have enough storage space to host a second database with production-grade, masked data.

This is not an uncommon scenario. We deal with quite a lot of clients, and many of them have complex, nuanced or specific requirements. Often they will need their data masked quickly, and frequently different types of masking are needed – obscured, human-readable and randomised being just some of those requested. And many clients also have limitations on storage and infrastructure that make the whole process more complicated.

Fortunately, the experience that we have accumulated means that we are able to deal with this multitude of different requirements, and deliver whatever a client needs.

The Body

If you have a small team, or release a large number of objects in each release, the time and resources needed to conduct a good peer review can have a negative impact: it can detract from development activities, delay projects, or force the peer reviewers to work ungodly hours!

Our app can often be the solution to this problem, particularly as it can be used in several different ways. For example, the app can be targeted at specific release folders, or at the entire repository. The output of this process can then be emailed or summarised in a presentation tool, such as Power BI or QlikView/QlikSense.

The frequency of these checks can be user-defined, so that any schedule is possible – on-demand, hourly, daily, weekly and monthly are some of the most common choices. We have found that end-of-day reports are particularly useful, as they provide the developers with a list of things to change and adapt to standards, meaning their next day can become productive almost immediately.

The app also handles exceptions with aplomb, featuring the ability to flag anything that matches your exception list. This helps to keep you abreast of anything that may not completely meet your standards, but is acceptable for the time being.
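
The app itself is proprietary, but for illustration, a minimal sketch of the general technique – scanning a release folder against a rule list while honouring an exception list – might look something like this. The rules, file patterns and exception entries are all hypothetical.

```python
import re
from pathlib import Path

# Hypothetical standards rules: pattern -> message shown to the developer.
RULES = {
    re.compile(r"SELECT \*"): "avoid SELECT *; name the columns",
    re.compile(r"\t"): "tabs found; standards require spaces",
}

# Items that break a rule but are accepted for the time being.
EXCEPTIONS = {"legacy_load.sql"}


def review_release(folder: str) -> dict[str, list[str]]:
    """Scan every script in a release folder and collect rule breaches."""
    findings = {}
    for script in Path(folder).glob("**/*.sql"):
        text = script.read_text()
        issues = [msg for pattern, msg in RULES.items() if pattern.search(text)]
        if issues:
            if script.name in EXCEPTIONS:
                issues = [f"(excepted) {msg}" for msg in issues]
            findings[script.name] = issues
    return findings


# The resulting dict can be emailed nightly or fed into a Power BI summary,
# giving each developer a to-do list at the start of the day.
for name, issues in review_release("releases/2024-06").items():
    print(name, issues)
```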

The Tail

When you’re dealing with this issue, it’s important to nurture good developer habits. Once you have built consistent code, developed with an approach that works for your company, then the Power BI app can be integrated into your development lifecycle, helping you to monitor and educate your team. This is particularly useful if you have a high turnover of staff, or use third parties to supplement your development resources.

On one occasion, we deployed and tailored the app for one of our clients, whose main objective was to ensure that all contractors developed consistently to their standards. This involved daily checks on the code, which were summarised and sent to the whole team for action. The focus was on reducing the time taken to peer review by encouraging immediate on-point development. Developers are challenged to reduce their list to zero each day, although it’s not quite gamification just yet!

Conclusion

The more companies we liaise with, the more complex situations we encounter. An automated peer review app is therefore extremely valuable.

This tool will not remove the need to peer review, but it will automate 90% of the job, allowing the peer reviewers to spend their time investigating code that would otherwise never be peer reviewed – much as automating the hunt for spelling mistakes frees a proofreader to focus on substance.


If you’d like to learn more about this app, or how we can help with your data project, please feel free to contact us.



Dealing with Data Masking

Engaging Data Explains:

Dealing with Data Masking


The Power BI Reporting Server is a powerful tool, and one of its most intuitive features is the ability to achieve true self-service business intelligence.

So in this blog post we’re going to walk you through the process involved with Self Service BI. You can use the free version of Power BI in order to achieve this, so there is no barrier to entry.

One of our guiding principles is to keep data masking simple. Sometimes clients have complicated or very specific requirements, so it’s always important to make the process as straightforward as possible.


Unusual Requirements

For example, one client we were working with had an unusual and interesting data masking requirement; they needed to mask their production data, but ensure that it was also human-readable, so that their development and testing team could create a new client portal system. This would have been complicated enough in itself, but their existing platform was also extremely complex, while there was little documentation available to help them understand the data storage process.

And there was another problem as well. The company had insufficient storage space to hold a second, in-line database containing the production-grade, masked data that was needed.

As we’re experienced and accomplished in this field, we quickly identified several possible solutions to this scenario. But choosing the best one was critically important. After some assessment, we opted for Redgate’s Data Masker for the following reasons:

• After a review of the underlying data structure, we concluded that it would be too difficult, costly and time-intensive to attempt to transfer the data into the test environment and apply masking rules there.

• It was important to make a distinction between masking and obscuring the data. The client wanted human-readable values, so we had to ensure this quality was retained.

• There was a lack of documentation regarding the location of personal identifiable data, which could result in the process missing an important part of the system.

• We also had a requirement to include a verification process, comparing the masked data against the source. This report would then provide us with an insight into whether any records had inadvertently been left unmasked.

We devised a simple plan to extract the data, load it into a SQL database, and then finally complete the masking process. This would allow the client’s development team to export the masked data and transfer it into the client portal.
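
Redgate’s Data Masker handles this commercially, so the sketch below is purely illustrative of the core idea: replacing identifying values with realistic, human-readable substitutes rather than random noise. The name pools and the deterministic-hash approach are our own assumptions for the example, not the tool’s actual mechanism.

```python
import hashlib

# Pools of plausible replacement values keep the output human-readable,
# which is what distinguishes masking from simply obscuring the data.
FIRST_NAMES = ["Alice", "Brian", "Carol", "Dev", "Elena", "Frank"]
SURNAMES = ["Archer", "Booth", "Carter", "Dawson", "Ellis", "Ford"]


def mask_value(value: str, pool: list[str]) -> str:
    """Deterministically map a real value to a substitute from the pool.

    Hashing keeps the mapping consistent across tables, so exact-match
    joins on the masked value still line up – one of the trickier
    requirements noted above.
    """
    digest = hashlib.sha256(value.encode()).hexdigest()
    return pool[int(digest, 16) % len(pool)]


row = {"first_name": "Margaret", "surname": "Hollis", "balance": 1024.50}
masked = {
    "first_name": mask_value(row["first_name"], FIRST_NAMES),
    "surname": mask_value(row["surname"], SURNAMES),
    "balance": row["balance"],  # non-identifying fields pass through untouched
}
print(masked)  # readable and consistent, but no longer the real person
```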

Technology

Identifying the data was always going to be a tricky process if attempted manually, due to the core system’s conventions around the naming of tables and columns. So to address this, we used WhereScape’s 3D product, which documented the structure of the system into a metadata layer. Our consultant worked closely with the business teams to update the metadata layer, highlighting the fields that contained personally identifiable data, while also adding business definitions.

We also took the opportunity to agree the type of data masking needed for each field. The most challenging aspect of this was understanding how the data was joined or reused across different tables. But the client provided all of the known diagrams and suggested relationships, which significantly shortened the investigation time involved.

At the end of this exercise, our client also gained detailed documentation of the core system’s data structure, as well as analysis of the data cardinality/profiles. This uncovered some interesting points about the system, including some aspects of it that held personally identifiable data of which the client was unaware.

Using the information within the metadata, the physical structure of the system was imported into WhereScape’s Red product, which automated the extraction of data and loaded it into a SQL database on a scheduled basis. We started this process gently, working on a daily schedule, but as we became more confident in the process, we increased the frequency to hourly.

Now that the data was present and optimised within the SQL database, we next used Redgate’s Data Masker to mask the personally identifiable data, based on the agreed rules held within the metadata. Once the rules had been designed, WhereScape Red’s scheduler automated the masking, so that it began as soon as the loading had been completed.

What could have been a hugely complicated and onerous process was made far simpler. The whole database was copied, masked and sent to the client portal within four hours.

Measuring the Process

As some of the data was being sent to a third party, it was very important that there was never any risk of a data breach. But we had no problem in building a methodology to address this. Using WhereScape Red, Engaging Data was able to build a comparison process. This utilised the metadata, using only those fields marked as containing personally identifiable data. This also made it possible to compare the values before and after the process had taken place.

Finally, the comparison report was automatically emailed to the management team, regardless of whether or not a failure was triggered. This email contained a summary of field error analysis, as well as the number of field errors per record. The latter was used to assess the overall process and prevent any sensitive data from being distributed to third parties. By automating this, we were able to reassure the client that the whole process was working correctly.
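
As an illustration of that verification step, a minimal sketch might compare each field flagged as personally identifiable in the metadata, counting any value that survived masking unchanged. The field names and record structure here are hypothetical.

```python
# Hypothetical verification pass: for every field flagged as personally
# identifiable in the metadata, confirm the masked value differs from
# the source value. Any match is a potential leak and counts as an error.

PII_FIELDS = ["first_name", "surname", "email"]  # as flagged in the metadata layer


def compare_records(source_rows: list[dict], masked_rows: list[dict]) -> dict:
    """Return per-field error counts plus the number of errors per record."""
    field_errors = {field: 0 for field in PII_FIELDS}
    record_errors = []
    for src, msk in zip(source_rows, masked_rows):
        errors = 0
        for field in PII_FIELDS:
            if src[field] == msk[field]:  # value survived masking unchanged
                field_errors[field] += 1
                errors += 1
        record_errors.append(errors)
    return {"by_field": field_errors, "per_record": record_errors}


report = compare_records(
    [{"first_name": "Margaret", "surname": "Hollis", "email": "m@x.com"}],
    [{"first_name": "Carol", "surname": "Hollis", "email": "c@y.com"}],
)
print(report)
# {'by_field': {'first_name': 0, 'surname': 1, 'email': 0}, 'per_record': [1]}
```

A summary like this can be emailed automatically, mirroring the field-error analysis and per-record counts described above.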

Conclusion

It’s quite common for Engaging Data to encounter complicated situations with a wide range of clients. Each of the following is a common problem or requirement:

  • Ageing trading platforms/core systems, or sources of data that can’t utilise off-the-shelf data masking products.
  • Companies needing the data masked quickly, at near real-time speed.
  • Different types of masking – obscured, human-readable or randomised.
  • Limitations on storage or infrastructure.

The best data masking tools will address these issues and automate the process, allowing the client to decide how, when and where to mask. Our expertise and experience in this area have enabled us to achieve some excellent results with some highly complex datasets and requirements.


If you would like to learn more about this approach, or how we can help with your data project, please feel free to contact us.



How Jenkins Takes WhereScape to Another Level

Engaging Data Explains:

How Jenkins Takes WhereScape to Another Level


Data warehousing has grown in importance and popularity as the global market for analytical systems continues to increase. The global market for data warehousing is expected to reach $30 billion by 2025, based on annual growth of around 12%, and in one survey 76% of IT managers and executives stated that they are investing more in their analytics platforms.

As more businesses use data warehouses, efficiency savings and improvements are expected going forward. Data automation is a concept that will benefit many companies, but it’s still important to choose the best solution.


Game-Changing Solution

That’s why using Jenkins to deploy WhereScape solutions is a game-changer. This integration tool, used with WhereScape’s data warehouse automation software, is rocket fuel for an already powerful package.

With Jenkins, it’s possible for developers to build and test software projects continuously, thanks to actions built into the tool. This makes it easier for developers to integrate changes to any project, increasing flexibility in working practices. This can be hugely advantageous in the fast-moving contemporary data climate.

And this is just the beginning of the integration offered by Jenkins. The tool also makes it possible to integrate with other apps and software solutions by installing plugins for the external tool – examples include Git and PowerShell. There are over 1,000 plugins available for Jenkins, meaning that the platform supports the building and testing of virtually any WhereScape project.

Low-Maintenance Software

Another key advantage of Jenkins is its low-maintenance nature. The tool requires very little attention once it has been installed. However, when updates are required, the software includes a built-in GUI tool, ensuring that this process is as painless as possible.

Yet while it offers an ideal platform, Jenkins also benefits from continual improvement, thanks to its open-source nature. There is already an enthusiastic community contributing to the tweaking and evolution of the software, and this is expected to grow further still in the years to come.

Jenkins is a shining example of continuous integration, delivery and deployment, sometimes referred to as CI/CD. This approach to data warehousing means that code changes that translate into real-world improvements can be made more frequently and reliably, due to the automation of deployment steps.

Easy Plug-Ins

The process for plugging Jenkins into WhereScape begins with downloading the Java SE Development Kit, at which point you will also need to add JAVA_HOME to your environment variables. That is the only technical part; you then simply download Jenkins using the installer and follow the on-screen instructions. Before you can use the software, it will be necessary to create an admin username and password. Then you’re ready to go!

Among the palette of useful features included in the software is a list view of open projects, which provides an instantaneous insight into the status of everything important that you’re dealing with. This is the sort of feature that has ensured that as well as being powerful and flexible, Jenkins has also earned kudos in the data warehousing world for being user-friendly. 

Jenkins incorporates a user interface that is simple to pick up and navigate. There is a vast range of online tutorials available, while the active community that contributes to the tool is always on hand to offer assistance.

Configure and Customise

Another important aspect of Jenkins is the scope of configuration and customisation that it makes possible. Users can be managed by creating groups and roles, and this can all be handled elegantly via some straightforward menu prompts. Jobs can also be configured; for example, the tool enables them to be executed via timed intervals. 

Every aspect of the Jenkins software has been cultivated to ensure maximum functionality with minimum effort, yet enabling users to customise and monitor everything extensively at all stages of the process. You can even set up automatic email notifications, ensuring that everyone involved with a data warehousing project is kept in the loop.
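
As one concrete example of that monitoring, Jenkins exposes a JSON API for each job, so a short script can check build status programmatically alongside the built-in email notifications. This is a hedged sketch: the server URL, job name and credentials are placeholders, and in practice you would authenticate with a Jenkins API token.

```python
import requests

# Placeholders - substitute your own server, job name and API token.
JENKINS_URL = "http://localhost:8080"
JOB_NAME = "warehouse-deploy"
AUTH = ("admin", "api-token")


def last_build_status(job: str) -> str:
    """Ask Jenkins' JSON API for the most recent build's outcome."""
    resp = requests.get(
        f"{JENKINS_URL}/job/{job}/lastBuild/api/json",
        auth=AUTH,
        timeout=10,
    )
    resp.raise_for_status()
    build = resp.json()
    if build.get("building"):
        return "IN PROGRESS"
    return build.get("result") or "UNKNOWN"  # e.g. SUCCESS, FAILURE, ABORTED


print(f"{JOB_NAME}: {last_build_status(JOB_NAME)}")
```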

At a time when the amount of information that companies deal with is escalating rapidly, data warehousing is becoming increasingly important. It’s simply not possible to ignore big data any longer; doing so is tantamount to being left behind by your competitors. Jenkins and WhereScape together form an elegant solution that has helped a multitude of businesses get to grips with their data warehousing requirements, without investing a huge amount of effort in training, onboarding, or hiring experts.

WhereScape was already a market leader in its field, but with the advent of CI/CD tools such as Jenkins, this top solution just became even more compelling.