Shades of green screen modernization – part 1

Have you ever thought about the different shades of green (screen) modernization?

Over the last several years, or even more than a decade, there have been multiple approaches to modernizing character-based terminal screens (3270, 5250, 9750 and other terminal protocols) that provide access to mainframe or legacy applications. Terms like Web-to-host, screen-scraping, Web-enablement and others have been coined around this issue of making old-looking screens more modern. Several people I have talked to over the last few years have mixed experience with these kinds of products, or have even developed their own tools to address the issue – there is pretty much no grey-zone experience: people either like it or hate it. But let us reset our mindset for now and take a neutral look at where we are today with these green screen modernization technologies.

Before considering this technology, you have to decide what kind of user or business problem you need to address, what kind of architecture you want to follow, and how much effort achieving this will take.

If you are not planning to redevelop or change your application code, then the following options are available.

  • Use your terminal emulation capabilities to access the green screens from a Web browser-based environment with basic-level “beautification”, i.e. different fonts, colors and function keys. This is pretty much a one-to-one mapping of a screen to a slightly nicer screen.
  • Stay with the green screen, but smartly wrap it to improve usability and accessibility, e.g. by running it in a Web browser enriched with HTML and JavaScript capabilities. This transformation can range from simple (modernizing just one screen) to complex screen aggregations, where one Web page manages multiple green screen sessions in the background.
  • Functionally wrap the green screen and expose the data and business logic as services (Web Services, REST, Java or .NET), and integrate these services into Web or desktop applications (see the sketch after this list).
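
To make the third option concrete, here is a minimal sketch in Java of wrapping one screen transaction as a REST service. Everything screen-specific is an assumption: the TerminalSession interface, its sendKeys/readField methods, the CUSTINQ navigation command and the /customer resource are hypothetical stand-ins for whatever your emulator or host-access toolkit actually provides.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// Hypothetical abstraction of a 3270/5250 terminal session; a real
// implementation would sit on top of your emulator or host-access API.
interface TerminalSession {
    void sendKeys(String keys);                     // type into the current screen
    String readField(int row, int col, int length); // scrape a field off the screen
}

public class GreenScreenService {

    // Stub session so the sketch is self-contained; swap in an
    // emulator-backed implementation in practice.
    static class StubSession implements TerminalSession {
        public void sendKeys(String keys) { /* no-op in this sketch */ }
        public String readField(int row, int col, int length) { return "DOE, JOHN"; }
    }

    public static void main(String[] args) throws Exception {
        TerminalSession session = new StubSession();
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        // Expose one green-screen transaction as a REST resource:
        // GET /customer?id=42 navigates the screen and returns scraped data.
        server.createContext("/customer", exchange -> {
            String query = exchange.getRequestURI().getQuery(); // e.g. "id=42"
            session.sendKeys("CUSTINQ " + query + "[enter]");   // hypothetical navigation
            String name = session.readField(5, 20, 30);         // scrape the name field
            byte[] body = ("{\"name\":\"" + name + "\"}").getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) { os.write(body); }
        });
        server.setExecutor(null); // use the default executor
        server.start();
    }
}
```

The point of the sketch is the layering: the Web or desktop application only ever sees the /customer resource, while all screen navigation stays hidden behind the service boundary.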

You should not exclude these capabilities from your portfolio of solutions, as many of them can provide a very good ROI by addressing specific business requirements rapidly and without major impact on your system (non-invasive).

In the next part of this series I am going to explain the pros and cons of these green screen modernization approaches.

The temperature of big data

Have you ever considered the fact that the availability of big data might also determine the way you process the data, both immediately and afterwards?

Well, I would like to introduce the term “big data temperature”, i.e. a way to determine the zone where big data is best processed. Let me introduce the three big data temperature zones.

Cold Big Data – Data that you normally receive on an hourly, daily or weekly basis. It does not really matter how fast the data is delivered. This might be because your data delivery site or your data broker is not able to transfer the data more frequently. The reason why it cannot be delivered more promptly could also lie in the technology receiving the data chunks (e.g. Hadoop likes to process major blocks of data at once).
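
As a small illustration of why the cold zone pairs with batch-oriented technology, here is a minimal sketch in Java. The ColdBatchWriter class and its block size are hypothetical; it simply accumulates records and flushes them as one large block, the access pattern Hadoop-style systems are built for.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of cold-zone ingestion: records accumulate and are then
// flushed as one large block, the pattern batch systems like Hadoop favor.
public class ColdBatchWriter {

    private final List<String> batch = new ArrayList<>();
    private final int blockSize;

    ColdBatchWriter(int blockSize) { this.blockSize = blockSize; }

    void append(String record) {
        batch.add(record);
        if (batch.size() >= blockSize) flush(); // or on an hourly/daily timer
    }

    void flush() {
        // Placeholder: a real writer would emit one block to HDFS or similar.
        System.out.println("flushing block of " + batch.size() + " records");
        batch.clear();
    }

    public static void main(String[] args) {
        ColdBatchWriter writer = new ColdBatchWriter(3);
        for (int i = 0; i < 7; i++) writer.append("record-" + i);
        writer.flush(); // flush the final partial block
    }
}
```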

Warm Big Data – Whenever you need to process big data that has to be available on demand, but not instantaneously, this is the right landing zone for your data. Warm big data can be stored in in-memory data grids or fast NoSQL databases. Whenever your warm big data needs to be retained for a longer time (getting colder), it is recommended to let the data flow into the cold big data zone.
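
Here is a minimal sketch of that flow, assuming a hypothetical WarmDataStore with a retention window; a real deployment would use an in-memory data grid or NoSQL database, and archiveToColdStore stands in for whatever feeds your cold zone.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Minimal sketch of a warm-zone store: data is served from memory on demand,
// and entries older than the retention window are demoted to the cold zone.
public class WarmDataStore {

    record Entry(String value, Instant arrived) {}

    private final Map<String, Entry> warm = new ConcurrentHashMap<>();
    private final Duration retention;

    WarmDataStore(Duration retention) { this.retention = retention; }

    void put(String key, String value) {
        warm.put(key, new Entry(value, Instant.now()));
    }

    // On-demand reads are answered straight from memory.
    String get(String key) {
        Entry e = warm.get(key);
        return e == null ? null : e.value();
    }

    // Run periodically: entries that have "cooled off" flow to the cold zone.
    void demoteCooledEntries() {
        Instant cutoff = Instant.now().minus(retention);
        warm.forEach((key, e) -> {
            if (e.arrived().isBefore(cutoff)) {
                archiveToColdStore(key, e.value());
                warm.remove(key);
            }
        });
    }

    private void archiveToColdStore(String key, String value) {
        // Placeholder for the hand-off into your cold big data system.
        System.out.println("demoted " + key + " to the cold zone");
    }
}
```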

Hot Big Data – You have to make instant decisions at the time you receive the data. This is critical when you process data-in-motion sources, e.g. sensor or location data. Technologies such as Complex Event Processing or low-latency messaging are tailored to that purpose. In case you need to retain the data for subsequent access, or you want to apply patterns identified in historic data, it is recommended to move the data to the warm or cold zone.
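
To show what in-line decision-making looks like, here is a minimal sketch, not a real CEP engine: a hypothetical HotPathDetector that evaluates each sensor reading against a sliding-window average the moment it arrives, before anything is persisted.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Minimal sketch of hot-zone processing: a sliding-window check over a
// sensor stream that must decide immediately, without persisting first.
// A production system would use a CEP engine or low-latency messaging.
public class HotPathDetector {

    private final Deque<Double> window = new ArrayDeque<>();
    private final int windowSize;
    private final double threshold;

    HotPathDetector(int windowSize, double threshold) {
        this.windowSize = windowSize;
        this.threshold = threshold;
    }

    // Called once per incoming reading; the decision happens in-line.
    void onReading(double value) {
        window.addLast(value);
        if (window.size() > windowSize) window.removeFirst();

        double avg = window.stream().mapToDouble(Double::doubleValue).average().orElse(0);
        if (avg > threshold) {
            react(avg); // instant decision while the data is still "hot"
        }
        // The raw reading could then flow on to the warm or cold zones.
    }

    private void react(double avg) {
        System.out.println("ALERT: window average " + avg + " exceeded threshold");
    }

    public static void main(String[] args) {
        HotPathDetector detector = new HotPathDetector(5, 80.0);
        for (double v : new double[]{70, 75, 82, 90, 95, 99}) detector.onReading(v);
    }
}
```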

The idea behind big data temperature zones should not be seen in isolation, but as a flowing-data concept that is always supported by best-of-breed technology – assuming that for the near future we will not have one technology that covers all zones.