Communication, Transparency and the tools that help.

Shannon here, wearing my communications hat today. One of our big goals recently has been streamlining communication, both within the foundation and with the community. I have been working to pull many of our channels of communication into one location, and working with David and Adam to make the channels we do have easier to search and consume. I’m releasing some of those tools now and want to take a few minutes to explain what we are currently doing and my goals moving forward.

In an earlier post, David mentioned HUBOT, an automated interactive agent developed by GitHub that a team can communicate with to perform various tasks, like answering queries or relaying messages. The project is widely used and respected, but I back-burnered adding it to our internal stack early on. About a month and a half ago I read an article about the executive who created Flickr as an internal project at a video game company that eventually went belly up (Read more here). The article went on to describe a new piece of software, Slack, that his current team developed to handle internal communication, and I was intrigued. I promptly installed it and started trying to understand the power of what it could do.

Slack on the surface looks much like a web-based IRC implementation, complete with rooms prefixed with a # (#general, #random) that any user of the system can easily create. But look a little deeper at the concept of integrations, and the true power of Slack slowly starts to creep in. At its heart, the Slack environment encourages adding integrations to your system. An integration is essentially an API connected to the Slack infrastructure.

They offer roughly 60 connections to various APIs, some of which open up further API lists, along with simple webhooks and event-based scripting. This opens the door to a nearly unlimited combination of actions that can be scripted. You can have external events trigger things to happen in one or all of your Slack rooms, or have events that happen in a Slack room trigger something to happen externally. The options are really quite amazing. But as with most things this powerful, it’s up to the user to decide how best to harness it.
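To illustrate the room-to-external direction, here is a minimal sketch, assuming a hypothetical token and a made-up `!status` command, of parsing the form-encoded payload that a Slack outgoing webhook POSTs to your server:

```python
from urllib.parse import parse_qs

# Hypothetical token; Slack shows the real one in the webhook's configuration.
EXPECTED_TOKEN = "xxxx-demo-token"

def handle_outgoing_webhook(body):
    """Parse a Slack outgoing-webhook POST body and build a JSON reply.

    Slack sends fields like token, channel_name, user_name and text as
    form-encoded data; returning a dict with a "text" key posts a reply
    back into the channel.
    """
    fields = {k: v[0] for k, v in parse_qs(body).items()}
    if fields.get("token") != EXPECTED_TOKEN:
        return None  # ignore requests that did not come from our Slack team
    # Hypothetical command: echo a status line back into the room.
    if fields.get("text", "").startswith("!status"):
        return {"text": "@{}: all systems nominal".format(fields["user_name"])}
    return None

# Example payload as Slack would send it (abbreviated):
reply = handle_outgoing_webhook(
    "token=xxxx-demo-token&channel_name=general&user_name=shannon&text=%21status"
)
```

The same handler shape works for any scripted command you want a room to trigger.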

Here is a quick example of a task I was able to automate in a matter of minutes without writing any code: reminding team members, publicly, about their blog posting schedule.

1. First, I created a calendar specifically to hold events that correlate with each member’s blog posting day.
2. I added a reminder that fires a few minutes before each event.
3. When the reminder fires, it triggers an HTTP POST call that takes the details of the calendar event and passes the values to the “slackbot” API.
4. The Slackbot API takes the values and sends a message to the #general channel.

This automation will run until I turn it off and took literally minutes to create. It was a somewhat whimsical automation, built because I could, but I needed a way to gently remind people of what’s being published today and by whom.
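The POST step above can be sketched as follows. This is a hypothetical illustration, not our actual configuration: the webhook URL, payload field names, and message wording are placeholders for whatever your Slack incoming-webhook integration expects.

```python
import json
from urllib import request

# Placeholder URL; Slack issues a unique one per incoming-webhook integration.
WEBHOOK_URL = "https://hooks.slack.com/services/T0000/B0000/XXXX"

def build_reminder(event_title, author, channel="#general"):
    """Turn a calendar event's details into a Slack incoming-webhook payload."""
    return {
        "channel": channel,
        "username": "slackbot",
        "text": "Reminder: '{}' by {} is scheduled to publish today.".format(
            event_title, author
        ),
    }

def post_reminder(payload):
    """POST the JSON payload to the webhook (network call; not executed here)."""
    req = request.Request(
        WEBHOOK_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    return request.urlopen(req)

payload = build_reminder("Communication, Transparency and the tools that help", "Shannon")
```

In practice the calendar service fills in the event details and fires the POST for you, which is why no code was needed for the real automation.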

Part of our transparency goal was making our Skype group chats more easily accessible. Stage 1 of this is complete: we now have 4 Skype channels that are logged, with the messages pumped into a Slack channel representing each Skype group. Stage 2 will relay these same messages to the forum, to allow people without Skype to see what’s happening. I’ll issue another update when the Skype=>forum integration is complete. Here is the github repository that contains the application I wrote to help log the Skype group chats. It requires only your Slack API endpoint and API key, along with the groups you are interested in logging and the matching Slack channel for each.
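A configuration for such a logger might look like the following sketch. The endpoint, key, and group names below are hypothetical placeholders and not the repository's actual config format:

```python
# Hypothetical configuration for a Skype-group => Slack-channel logger.
CONFIG = {
    "slack_endpoint": "https://example.slack.com/api/",  # placeholder endpoint
    "slack_api_key": "xoxp-REPLACE-ME",                  # placeholder key
    # Map each Skype group chat being logged to the Slack channel that mirrors it.
    "group_map": {
        "Mastercoin Dev": "#skype-dev",
        "Mastercoin General": "#skype-general",
    },
}

def channel_for_group(config, group_name):
    """Return the Slack channel mirroring a Skype group, or None if unlogged."""
    return config["group_map"].get(group_name)
```

Keeping the group-to-channel mapping as plain data makes it easy to add a new logged group without touching the logger itself.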


Another feature we are working on is a single publishing point from within Slack. We created plugins for both Facebook and Twitter that interface with Slack through MMBOT, a C# port of a project very similar in spirit to HUBOT. I’m at heart a .NET and Java developer, so I jumped at a C# bot that I can use to tie together the missing pieces.

Here are the plugins I wrote to facilitate publishing tweets and Facebook posts. This particular integration uses the Zapier API, which brings with it over 80 additional API connections that can be consumed from Slack, HUBOT, MMBOT, etc.
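As an illustration of the pattern rather than the actual plugin code, a bot command handler might forward the message text to a Zapier catch hook, which then fans it out to Twitter and Facebook. The hook URL and payload shape below are placeholders:

```python
import json
from urllib import request

# Placeholder; Zapier issues a unique catch-hook URL per Zap.
ZAPIER_HOOK = "https://hooks.zapier.com/hooks/catch/0000/abcd/"

def build_publish_request(message, networks=("twitter", "facebook")):
    """Package a chat command like '!publish hello world' for a Zapier hook.

    The Zap on the other end decides how each network renders the post.
    """
    payload = {"text": message, "networks": list(networks)}
    return request.Request(
        ZAPIER_HOOK,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_publish_request("Master Core UI release is getting closer!")
# request.urlopen(req) would fire the hook; omitted here to avoid a network call.
```

The bot stays tiny this way: it only validates the command and hands the text off, while Zapier owns the per-network credentials and formatting.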




Genercoin – the Green ENERgy Asset backed coin born on the Master Protocol by Judith – BizDev at the Mastercoin Foundation

Dear Masterminds,

We are extremely fortunate to provide a platform for innovative projects. We’ve been working with a variety of innovators and experts who want to create amazing things.

Today, I would like to introduce you to GENERcoin.

GENERcoin is the Green ENERgy Asset Backed Coin, backed by Arterran Renewables’ innovative solid biofuel made from sustainable non-food sources, such as manure and municipal solid waste, that is a direct replacement for coal. Coal contributes 40% of greenhouse gas emissions in the US alone (source: US EPA). Each GEC is a receipt and a claim on the biofuel that backs it.

GENERcoin is modeled on this concept and is backed by the deliverable renewable green energy assets provided by Arterran Renewables’ NextGen solid biofuel. Each GENERcoin, built upon the Master Protocol, is a claim, a receipt, for the energy backing the coin, which may be redeemed, traded or exchanged according to the holder’s wishes.

The value of GENERcoin is represented by the deliverable energy backing it, which can be more stable than debt-based assets, which may be devalued by inflation.

If you want to know more about the GenerCoin project, please visit:


Great things are happening with the Master Protocol.

Point of contact for companies and organizations that want to issue a token: Judith Jakubovics, Judith (-at-) and on Skype: Judith.Jakubovics


Technology Update: Sleep when you’re dead.

It’s been another frenetic and massively productive week in the Mastermind world, with new features flying in to the codebase faster than we’ve seen before.

This week we’ve pulled in more RPC updates, interface updates, manual (unlocked) issuance of assets (grant, revoke), p2sh multi-sig, and additional re-org protection – and that’s just in Master Core. No less than four third-party integrators (exchanges and wallets) have begun tying Master Core into their backend systems this week, and next week is going to be a frenzy of testing, building, testing, packaging and testing.

Two of the groups who are white-labeling Omniwallet have also provided some exceptional feedback and contributions in terms of scalability, security and interface, and the community that is growing around the Master Protocol is finally beginning to see the capabilities that are about to be unleashed.

On the testing front, we’ve updated our Spock engine to allow for more automated tests on commits and builds, working through “cumulative hashes” that allow all clients to know that the balances they are using are the same as all other clients’.

For Omniwallet, we’re soliciting feedback from users in our “What’s in Our Wallet” survey (which, if you haven’t yet taken it, submit your thoughts – it will guide the future direction of Omni), populating the OmniEngine backend database, adding logic for processing DEx payments, and beginning to integrate the new front-end API.

QA and testing were hot on the plate this week, as we welcomed a new team member who will be focused on test plans and quality assurance.

Exciting new ideas are being pumped into the spec on an almost daily basis: smart property administration, futures contracts, ways to decrease transaction fees, and minimizing the blockchain storage requirements of Master Protocol transactions.

The Master Core UI is getting closer to release, as well; below can be seen the Balances tab:

All in all, the team has been working around the clock, continuing to knock the socks off anyone in the general vicinity. As an old colleague once told me, when referring to working non-stop: “you can sleep when you’re dead.”  Our next release will be coming soon, and with it the results of months of non-stop effort will be shared with the world.  

As always, we solicit and invite contributors to come and comment, critique or contribute. The more masterminds the better. Ask me anything, and keep your eyes peeled for what’s coming.

Craig Sellars

CTO, Mastercoin Foundation

craig (at)


Omni: A Tale of Two

Omniwallet today has two major pieces that are the focus of the team’s effort to leverage the new database infrastructure that will power the next revision of the web wallet.

The first piece has been under heavy development for the past 3 weeks: the back end module which populates the database with usable data, gloriously called OmniEngine. It is the engine behind Omni, driving the data into the correct place in the database and dragging the data onto the page for the user to see.

At present it can parse, store, retrieve and handle all Bitcoin transactions as well as most Mastercoin transactions. Thanks to frequent updates to Master Core, we can now add support and logic for Distributed EXchange payments as well as the transactions used in the upcoming MetaDEx feature.

The second piece, on which we officially broke ground today, is the front-end API, which allows Omniwallet to properly interact with and retrieve data from the database. Having the data is no good if you can’t use it – and this API will allow Omniwallet (and in the future, other tools) to properly extract relevant transaction data and make it intelligible and presentable to the end-user.

We’re hoping to have most of these puzzle pieces in place and ready within the next few weeks, so keep an eye out for updates.


Use-case of Master Protocol- DApps

To those heavily involved in the community, the use cases of Mastercoin outlined in the whitepaper go without saying.

Savings accounts, CFDs and other smart contracts, the ability to digitize tangible and real assets…

I really want to draw our attention to, and emphasize, the use case of tokens that grant access to software, in the model of the “Decentralized Application”.

The Decentralized Application model addresses a few problems that have been historically relevant to entrepreneurship: early access to liquidity, monetization of open source, finding the best talent to build the product, capturing early adopters.

In my AMA today I’d like to focus on the merits of this model, if any of these have been concerns of yours in developing a value-generating service.

Please find me on Reddit for my AMA for an in-depth discussion.
