
Eidoo Wallet App's official subreddit

Fast, easy, and not only a Multicurrency Wallet: it's a hybrid exchange too. It’s Eidoo. This is Eidoo App's official subreddit where we talk only about Eidoo and its ecosystem.

Liquid Subreddit | Liquid.com

Welcome to /Liquid, a community-driven sub where you can discuss anything involving Liquid Exchange and QASH, soon to be re-branded as the Liquid Token.

BiblePay (BBP)

BiblePay (BBP) is a Christian charity cryptocurrency that donates 10% of coins to charity every month, sponsoring orphans.

My day trading experience with Bitcoin, summarized in one image

submitted by jonesyjonesy to Bitcoin

Battlestate Games Needs to update "Rules of the Game"

According to the "Rules of the Game" and the TOS for EFT on https://www.escapefromtarkov.com/, there is nothing said about a player not being allowed to share the wealth. The only place you can find this rule is on Twitter.
BoOsTiNg
Now, if you happen to be me: I was recently banned, falsely, for RMT or "boosting". I was not aware that such a rule existed and unknowingly broke a rule that was only posted on a social media platform that I do not use. There is nothing on the game's launcher stating that players can't "share the wealth", and there is no big red text in-game warning you about "boosting".
Perhaps they are just scrambling to deal with the cheating situation, since RMT results in larger volumes of cheating... good luck playing Labs.
As a result of all this, I am banned from posting on their forum, all because I do not have Twitter, and even if I did, I was not following them. Someone, please post on their forum that the TOS and "Rules of the Game" need to be updated.
This is the RMT statement in the License Agreement from https://www.escapefromtarkov.com/ " Prohibited commercial use: use, either in total or individual parts, of the Launcher Application or of the Game for any purposes not expressly permitted by Battlestate Games Limited, including, among other things: (i) use of the Game in commercial establishments; (ii) collection of game money, items, materials, resources, etc. for sale outside the Game; (iii) provision of game services, such as raising the level, in exchange for payment outside the Game; or (iv) transfer or facilitation of distribution (by text or audio means or any other method) of commercial advertising or offers through or within the scope of the Game"

https://preview.redd.it/gtu3onnxalf51.png?width=1920&format=png&auto=webp&s=28b52a780d63004a15874d33a0ab78466c624d50
Below I will detail the actions that resulted in my ban, and I want to make it very clear that this is NOT AN APPEAL to be unbanned, and it is not a repost of previous content except the images. I am explaining the circumstances in which the ToS and "Rules of the Game" not being updated resulted in a ban. MANY PEOPLE ARE ASKING ABOUT THE EXTENT TO WHICH ITEMS WERE LENT OR GIVEN, SO I AM ADDRESSING THAT WITH THE INFORMATION BELOW.
Example 1
On July 10th, my friend was done playing EFT, so he gave me his RR (Red Rebel) and SICC case.

RR and SICC
Example 2 August 3rd
I recently reached out to my friend in the Army, who I had not spoken to in a while, and the two of us were going to start grinding Tarkov together. He was new to the game and I wanted us to be on the same level with gear, but I still wanted to keep my valuables, such as my keys, SICC case, docs cases, dog tag case, Red Rebel, and 6 bitcoin to be used for the weapons case trade. The plan was that after I reset my account, he would throw that stuff back to me. On August 3rd we met on Shoreline and I gave him my stuff, except for the RR, because I forgot to put it in my scabbard. On August 4th I gave him my Red Rebel and reset my account immediately after. Once my account was reset, he threw everything I gave him back to me.

Proof of reset

He was a new player, level 4 to be exact.
Example 3
On August 5th, I lent my friend a blue, green, violet, and black card. He got cheated and gave the cards back to me 2 hours later. HERE IS THE LINK TO THE BROADCAST: https://www.twitch.tv/videos/701225050
Between the times 44:45 and 46:00 you can hear them talking about the borrowed cards. Here is the clip for that https://clips.twitch.tv/CourteousCrepuscularGuanacoNerfRedBlaster

Conversations about borrowing keycards

twitch chat about it
So for anyone who may have questions about the circumstances in which the ToS and "Rules of the Game" apply, here is an example of why they need to be updated.


submitted by Schmuckalow to EscapefromTarkov

12-30 20:12 - 'Mcafee Trading bots can read images now, wtf' (youtube.com) by /u/pietpatat removed from /r/Bitcoin within 0-10min

Mcafee Trading bots can read images now, wtf
Author: pietpatat
submitted by removalbot to removalbot

Stop asking if you should BUY or SELL, here is an image that will explain Bitcoin trading to you in seconds.

submitted by CryptoBaby to Bitcoin

Your Pre Market Brief for 08/27/2020

Your Pre Market Brief for Thursday August 27th 2020

You can subscribe to the daily 4:00 AM Pre Market Brief on The Twitter Link Here. Alerts in the tweets will direct you to the daily 4:00 AM Pre Market Brief in this sub.
Morning Research and Trading Prep Tool Kit
The Ultimate Quick Resource For the Amateur Trader.
Published 3:00 AM EST / Updated as of 3:30 AM EST
-----------------------------------------------
Stock Futures:
Wednesday 08/26/2020 News and Markets Recap:
Thursday August 27th 2020 Economic Calendar (All times are Eastern)

TODAY: GDP AND UNEMPLOYMENT!!!!

ALSO PENDING HOME SALES
Overnight News Heading into Thursday August 27th 2020
(News Yet to be Traded 8:00 PM - 4:00 AM EST)
End of Day and After Hours News Heading into Thursday August 27th 2020
(News Traded 4:00 PM - 8:00 PM EST)
Offering News
Note: Seeking Alpha URLs and Reddit do not get along.
Upcoming Earnings:
-----------------------------------------------
Morning Research and Trading Prep Tool Kit
Other Useful Resources:
The Ultimate Quick Resource For the Amateur Trader.
Subscribe to this brief and the daily 4:00 AM Pre Market Brief on The Twitter Link Here. Alerts in the tweets will direct you to the daily brief in this sub.
It is up to you to judge the accuracy and veracity of the above before trading. I take no responsibility for the accuracy of the information in this thread.
submitted by Cicero1982 to pennystocks

[Spoilers S7] Here's what we know about the state of Earth before the bombs

Here's a compiled list of what Earth was like pre-apocalypse, using details from the show. Jason Rothenberg has said that if the prequel gets greenlit, he wants to implement a lot of flashbacks, LOST-style. These flashbacks would probably include references to the following:

Oil Depletion

Dust Storms

Water Shortages

Global Warming

Global Pandemic

Overpopulation

Technological Advancements

Becca Franko, The Tech Celebrity

Financial Crisis

Drug Legalization

Battles in U.S. Cities

Resistance Groups & Terrorism

Corrupt U.S. Government

Easy For Cults to Thrive

That's what I got. If you spotted anything else from the show, feel free to share! :)
submitted by Sharoza to The100

Coinbase.com cheats their customers in plain sight. Why do we put up with this?

Last night I needed a small amount of crypto and decided to purchase some on coinbase.com (normally I use coinbase pro for trading, but this was a unique situation).
I was already dreading it because I knew I would be paying a premium to buy with my debit card, but hey, that's the convenience tax for you. What I wasn't expecting was to also get ripped off on an imaginary spread. I have purchased with my debit card on Coinbase in the past and never paid much attention to the "price per coin," as I thought this was the grand total (after including the high premium for paying with a debit card). However, last night, I decided to actually do the math.
The purchase was for $140 total at 2:57 PM EST March 23. The transaction fee was $5.37 so that means I purchased $134.63 worth of Bitcoin.
The price for BTC at that time was $11674.90 so that means I should have received
0.01153157628 BTC
However what I received was:
.01148441 BTC
or a difference of about
0.00004716628 BTC (roughly $0.55 USD)
If you do the math and factor in that difference, it lines up with the price per coin they show in the image: $11722.85.
This cannot be chalked up to market movement and paying the spread, as the price of BTC had not been anywhere near $11722.85 for a couple of hours before my purchase.
The amounts here are small and pretty irrelevant. The important point here is to illustrate that Coinbase is charging customers a "ghost" spread when they buy on coinbase.com.
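The arithmetic in the post can be checked directly with the posted figures:

```python
# Figures from the purchase described above
total_usd = 140.00
fee_usd = 5.37
spent_usd = total_usd - fee_usd            # $134.63 actually went to BTC

quoted_price = 11674.90                    # market price at the time of purchase
expected_btc = spent_usd / quoted_price    # what the market price implies
received_btc = 0.01148441                  # what Coinbase actually delivered

shortfall_btc = expected_btc - received_btc
effective_price = spent_usd / received_btc # the "ghost" price actually charged

print(f"expected:  {expected_btc:.11f} BTC")
print(f"received:  {received_btc:.11f} BTC")
print(f"shortfall: {shortfall_btc:.11f} BTC (~${shortfall_btc * quoted_price:.2f})")
print(f"effective price per coin: ${effective_price:,.2f}")
```

The effective price works out to about $11,722.85 per coin, versus the $11,674.90 quote, which is the hidden spread the post describes.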
submitted by el_diablo_robotico to CryptoCurrency

Dragonchain Great Reddit Scaling Bake-Off Public Proposal

Dragonchain Great Reddit Scaling Bake-Off Public Proposal

Dragonchain Public Proposal TL;DR:

Dragonchain has demonstrated handling twice Reddit's entire daily volume (votes, comments, and posts, per Reddit's 2019 Year in Review) in a 24-hour demo on an operational network. Every single transaction on Dragonchain is decentralized immediately through 5 levels of Dragon Net, and then secured with combined proof on Bitcoin, Ethereum, Ethereum Classic, and Binance Chain, via Interchain. At the time, in January 2020, the entire cost of the demo was approximately $25K on a single system (transaction fees locked at $0.0001/txn). With current fees (lowest fee $0.0000025/txn), this would cost as little as $625.
Watch Joe walk through the entire proposal and answer questions on YouTube.
This proposal is also available on the Dragonchain blog.

Hello Reddit and Ethereum community!

I’m Joe Roets, Founder & CEO of Dragonchain. When the team and I first heard about The Great Reddit Scaling Bake-Off we were intrigued. We believe we have the solutions Reddit seeks for its community points system and we have them at scale.
For your consideration, we have submitted our proposal below. The team at Dragonchain and I welcome and look forward to your technical questions, philosophical feedback, and fair criticism, to build a scaling solution for Reddit that will empower its users. Because our architecture is unlike other blockchain platforms out there today, we expect to receive many questions while people try to grasp our project. I will answer all questions here in this thread on Reddit, and I've answered some questions in the stream on YouTube.
We have seen good discussions so far in the competition. We hope that Reddit’s scaling solution will emerge from The Great Reddit Scaling Bake-Off and that Reddit will have great success with the implementation.

Executive summary

Dragonchain is a robust open source hybrid blockchain platform that has proven to withstand the passing of time since our inception in 2014. We have continued to evolve to harness the scalability of private nodes, yet take full advantage of the security of public decentralized networks, like Ethereum. We have a live, operational, and fully functional Interchain network integrating Bitcoin, Ethereum, Ethereum Classic, and ~700 independent Dragonchain nodes. Every transaction is secured to Ethereum, Bitcoin, and Ethereum Classic. Transactions are immediately usable on chain, and the first decentralization is seen within 20 seconds on Dragon Net. Security increases further to public networks ETH, BTC, and ETC within 10 minutes to 2 hours. Smart contracts can be written in any executable language, offering full freedom to existing developers. We invite any developer to watch the demo, play with our SDK’s, review open source code, and to help us move forward. Dragonchain specializes in scalable loyalty & rewards solutions and has built a decentralized social network on chain, with very affordable transaction costs. This experience can be combined with the insights Reddit and the Ethereum community have gained in the past couple of months to roll out the solution at a rapid pace.

Response and PoC

In The Great Reddit Scaling Bake-Off post, Reddit has asked for a series of demonstrations, requirements, and other considerations. In this section, we will attempt to answer all of these requests.

Live Demo

A live proof of concept showing hundreds of thousands of transactions
On Jan 7, 2020, Dragonchain hosted a 24-hour live demonstration during which a quarter of a billion (250 million+) transactions executed fully on an operational network. Every single transaction on Dragonchain is decentralized immediately through 5 levels of Dragon Net, and then secured with combined proof on Bitcoin, Ethereum, Ethereum Classic, and Binance Chain, via Interchain. This means that every single transaction is secured by, and traceable to these networks. An attack on this system would require a simultaneous attack on all of the Interchained networks.
24 hours in 4 minutes (YouTube):
24 hours in 4 minutes
The demonstration was of a single business system, and any user is able to scale this further, by running multiple systems simultaneously. Our goals for the event were to demonstrate a consistent capacity greater than that of Visa over an extended time period.
Tooling to reproduce our demo is available here:
https://github.com/dragonchain/spirit-bomb

Source Code

Source code (for on & off-chain components, as well as tooling used for the PoC). The source code does not have to be shared publicly, but if Reddit decides to use a particular solution it will need to be shared with Reddit at some point.

Scaling

How it works & scales

Architectural Scaling

Dragonchain’s architecture attacks the scalability issue from multiple angles. Dragonchain is a hybrid blockchain platform, wherein every transaction is protected on a business node to the requirements of that business or purpose. A business node may be held completely private or may be exposed or replicated to any level of exposure desired.
Every node has its own blockchain and is independently scalable. Dragonchain established Context Based Verification as its consensus model. Every transaction is immediately usable on a trust basis, and in time is provable to an increasing level of decentralized consensus. A transaction will have a level of decentralization to independently owned and deployed Dragonchain nodes (~700 nodes) within seconds, and full decentralization to BTC and ETH within minutes or hours. Level 5 nodes (Interchain nodes) function to secure all transactions to public or otherwise external chains such as Bitcoin and Ethereum. These nodes scale the system by aggregating multiple blocks into a single Interchain transaction on a cadence. This timing is configurable based upon average fees for each respective chain. For detailed information about Dragonchain’s architecture, and Context Based Verification, please refer to the Dragonchain Architecture Document.

Economic Scaling

An interesting feature of Dragonchain’s network consensus is its economics and scarcity model. Since Dragon Net nodes (L2-L4) are independent staking nodes, deployment to cloud platforms would allow any of these nodes to scale to take on a large percentage of the verification work. This is great for scalability, but not good for the economy, because there is no scarcity, and pricing would develop a downward spiral and result in fewer verification nodes. For this reason, Dragonchain uses TIME as scarcity.
TIME is calculated as the number of Dragons held, multiplied by the number of days held. TIME influences the user’s access to features within the Dragonchain ecosystem. It takes into account both the Dragon balance and length of time each Dragon is held. TIME is staked by users against every verification node and dictates how much of the transaction fees are awarded to each participating node for every block.
TIME also dictates the transaction fee itself for the business node. TIME is staked against a business node to set a deterministic transaction fee level (see transaction fee table below in Cost section). This is very interesting in a discussion about scaling because it guarantees independence for business implementation. No matter how much traffic appears on the entire network, a business is guaranteed to not see an increased transaction fee rate.
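The TIME formula above is simple enough to state directly; a minimal sketch (the headline formula only: staking weights and the fee-tier table are set by the platform and not reproduced here):

```python
def time_score(dragons_held: float, days_held: int) -> float:
    """TIME = number of Dragons held x number of days held."""
    return dragons_held * days_held

# Example: 10,000 Dragons held for 90 days
print(time_score(10_000, 90))  # → 900000
```

Because the score grows with holding duration, two accounts with the same balance can have very different TIME, which is what creates the scarcity the consensus economics rely on.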

Scaled Deployment

Dragonchain uses Docker and Kubernetes to allow the use of best-practice traditional system scaling. Dragonchain offers managed nodes with an easy-to-use, web-based console interface. The user may also deploy a Dragonchain node within their own datacenter or favorite cloud platform. Users have deployed Dragonchain nodes on-prem and on Amazon AWS, Google Cloud, MS Azure, and other hosting platforms around the world. Any executable code, anything you can write, can be written into a smart contract. This flexibility is what allows us to say that developers with no blockchain experience can use any code language to access the benefits of blockchain. Customers have used NodeJS, Python, Java, and even BASH shell script to write smart contracts on Dragonchain.
With Docker containers, we achieve better separation of concerns, faster deployment, higher reliability, and lower response times.
We chose Kubernetes for its self-healing features, ability to run multiple services on one server, and its large and thriving development community. It is resilient, scalable, and automated. OpenFaaS allows us to package smart contracts as Docker images for easy deployment.
Contract deployment time is now bounded only by the size of the Docker image being deployed but remains fast even for reasonably large images. We also take advantage of Docker’s flexibility and its ability to support any language that can run on x86 architecture. Any image, public or private, can be run as a smart contract using Dragonchain.
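Since any executable packaged as a Docker image can serve as a contract, the contract body itself can be plain code. A hypothetical sketch (the handler name and payload shape are illustrative, not Dragonchain's actual SDK interface):

```python
import json

def handle(payload: dict) -> dict:
    """Hypothetical contract logic: award 1 point per upvote in the payload."""
    return {"points_awarded": payload.get("upvotes", 0)}

# Any wrapper (an OpenFaaS handler, an HTTP endpoint, or stdin/stdout piping)
# could feed transactions to handle(); the platform only sees a Docker image.
print(json.dumps(handle({"upvotes": 5})))  # → {"points_awarded": 5}
```

The same logic could just as easily be a Node.js script or a BASH one-liner; the Docker boundary is what makes the language choice irrelevant to the chain.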

Flexibility in Scaling

Dragonchain’s architecture considers interoperability and integration as key features. From inception, we had a goal to increase adoption via integration with real business use cases and traditional systems.
We envision the ability for Reddit, in the future, to be able to integrate alternate content storage platforms or other financial services along with the token.
  • LBRY - to allow users to deploy content natively to LBRY
  • MakerDAO - to allow users to lend small amounts backed by their Reddit community points
  • STORJ/SIA - to allow decentralized on-chain storage of portions of content
These integrations, or any others, are relatively easy to implement on Dragonchain with an Interchain implementation.

Cost

Cost estimates (on-chain and off-chain)
For the purpose of this proposal, we assume that all transactions are on chain (posts, replies, and votes).
On the Dragonchain network, transaction costs are deterministic/predictable. By staking TIME on the business node (as described above) Reddit can reduce transaction costs to as low as $0.0000025 per transaction.
Dragonchain Fees Table

Getting Started

How to run it
Building on Dragonchain is simple and requires no blockchain experience. Spin up a business node (L1) in our managed environment (AWS), run it in your own cloud environment, or on-prem in your own datacenter. Clear documentation will walk you through the steps of spinning up your first Dragonchain Level 1 Business node.
Getting started is easy...
  1. Download Dragonchain’s dctl
  2. Input three commands into a terminal
  3. Build an image
  4. Run it
More information can be found in our Get started documents.

Architecture
Dragonchain is an open source hybrid platform. Through Dragon Net, each chain combines the power of a public blockchain (like Ethereum) with the privacy of a private blockchain.
Dragonchain organizes its network into five separate levels. A Level 1, or business node, is a totally private blockchain only accessible through the use of public/private keypairs. All business logic, including smart contracts, can be executed on this node directly and added to the chain.
After creating a block, the Level 1 business node broadcasts a version stripped of sensitive private data to Dragon Net. Three Level 2 Validating nodes validate the transaction based on guidelines determined from the business. A Level 3 Diversity node checks that the level 2 nodes are from a diverse array of locations. A Level 4 Notary node, hosted by a KYC partner, then signs the validation record received from the Level 3 node. The transaction hash is ledgered to the Level 5 public chain to take advantage of the hash power of massive public networks.
Dragon Net can be thought of as a “blockchain of blockchains”, where every level is a complete private blockchain. Because an L1 can send to multiple nodes on a single level, proof of existence is distributed among many places in the network. Eventually, proof of existence reaches level 5 and is published on a public network.
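The five-level flow described above can be sketched as plain data flow (a conceptual model only; node selection, signing, and broadcast details are simplified away, and the field names are illustrative):

```python
import hashlib

def strip_private(block: dict) -> dict:
    """L1 broadcasts a version stripped of sensitive data: only a hash leaves the node."""
    digest = hashlib.sha256(block["payload"].encode()).hexdigest()
    return {"block_id": block["block_id"], "payload_hash": digest}

def dragon_net(block: dict) -> dict:
    """Conceptual walk of a single block through the five levels."""
    record = strip_private(block)                     # L1: private business chain
    record["l2_validations"] = 3                      # L2: three validating nodes
    record["l3_diversity_ok"] = True                  # L3: L2 nodes are geographically diverse
    record["l4_notarized"] = True                     # L4: KYC-partner notary signature
    record["l5_anchored_to"] = ["BTC", "ETH", "ETC"]  # L5: hash ledgered to public chains
    return record

proof = dragon_net({"block_id": 1, "payload": "sensitive business data"})
assert "payload" not in proof  # the raw business data never leaves the L1 node
```

The key property the sketch illustrates is that only the hash travels upward, so public anchoring adds security without exposing private data.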

API Documentation

APIs (on chain & off)

SDK Source

Nobody’s Perfect

Known issues or tradeoffs
  • Dragonchain is open source and even though the platform is easy enough for developers to code in any language they are comfortable with, we do not have so large a developer community as Ethereum. We would like to see the Ethereum developer community (and any other communities) become familiar with our SDK’s, our solutions, and our platform, to unlock the full potential of our Ethereum Interchain. Long ago we decided to prioritize both Bitcoin and Ethereum Interchains. We envision an ecosystem that encompasses different projects to give developers the ability to take full advantage of all the opportunities blockchain offers to create decentralized solutions not only for Reddit but for all of our current platforms and systems. We believe that together we will take the adoption of blockchain further. We currently have additional Interchain with Ethereum Classic. We look forward to Interchain with other blockchains in the future. We invite all blockchains projects who believe in decentralization and security to Interchain with Dragonchain.
  • While we only have ~700 nodes, compared to Ethereum's 8,000 and Bitcoin's 10,000, we harness those 18,000 public nodes to scale to extremely high levels of security. See Dragonchain metrics.
  • Some may consider the centralization of Dragonchain’s business nodes as an issue at first glance, however, the model is by design to protect business data. We do not consider this a drawback as these nodes can make any, none, or all data public. Depending upon the implementation, every subreddit could have control of its own business node, for potential business and enterprise offerings, bringing new alternative revenue streams to Reddit.

Costs and resources

Summary of cost & resource information for both on-chain & off-chain components used in the PoC, as well as cost & resource estimates for further scaling. If your PoC is not on mainnet, make note of any mainnet caveats (such as congestion issues).
Every transaction on the PoC system had a transaction fee of $0.0001 (one-hundredth of a cent USD). At 256MM transactions, the demo cost $25,600. With current operational fees, the same demonstration would cost $640 USD.
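The fee arithmetic in this summary works out as stated (note it uses 256MM transactions, where the Live Demo section rounds to 250MM+):

```python
demo_txns = 256_000_000

old_fee = 0.0001      # $/txn locked during the January 2020 demo
new_fee = 0.0000025   # $/txn at the current lowest fee tier

print(f"demo cost then: ${demo_txns * old_fee:,.0f}")   # $25,600
print(f"same demo now:  ${demo_txns * new_fee:,.0f}")   # $640
```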
For the demonstration, to achieve throughput to mimic a worldwide payments network, we modeled several clients in AWS and 4-5 business nodes to handle the traffic. The business nodes were tuned to handle higher throughput by adjusting memory and machine footprint on AWS. This flexibility is valuable to implementing a system such as envisioned by Reddit. Given that Reddit’s daily traffic (posts, replies, and votes) is less than half that of our demo, we would expect that the entire Reddit system could be handled on 2-5 business nodes using right-sized containers on AWS or similar environments.
Verification was accomplished on the operational Dragon Net network with over 700 independently owned verification nodes running around the world at no cost to the business other than paid transaction fees.

Requirements

Scaling

This PoC should scale to the numbers below with minimal costs (both on & off-chain). There should also be a clear path to supporting hundreds of millions of users.
Over a 5 day period, your scaling PoC should be able to handle:
  • 100,000 point claims (minting & distributing points)
  • 25,000 subscriptions
  • 75,000 one-off points burning
  • 100,000 transfers
During Dragonchain’s 24 hour demo, the above required numbers were reached within the first few minutes.
Reddit's total activity is 9,000% more than Ethereum's total transaction level. Even if you do not include votes, it is still 700% more than Ethereum's current volume. Dragonchain has demonstrated that it can handle 250 million transactions a day, and its architecture allows for multiple systems to work at that level simultaneously. In our PoC, we demonstrated double the full capacity of Reddit, and every transaction was proven all the way to Bitcoin and Ethereum.
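Putting the required 5-day volumes next to the demonstrated throughput (a back-of-the-envelope check using the numbers above):

```python
required = {
    "point claims": 100_000,
    "subscriptions": 25_000,
    "one-off burns": 75_000,
    "transfers": 100_000,
}
total_required = sum(required.values())          # 300,000 ops over 5 days

demo_txns_per_day = 250_000_000
demo_txns_per_minute = demo_txns_per_day / (24 * 60)

print(f"required over 5 days: {total_required:,}")
print(f"demo rate: ~{demo_txns_per_minute:,.0f} txns/minute")
# At ~173,611 txns/minute, 300,000 operations clear in under two minutes,
# consistent with the claim that the required numbers were reached
# within the first few minutes of the demo.
```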
Reddit Scaling on Ethereum

Decentralization

Solutions should not depend on any single third-party provider. We prefer solutions that do not depend on specific entities such as Reddit or another provider, and solutions with no single point of control or failure in off-chain components but recognize there are numerous trade-offs to consider
Dragonchain’s architecture calls for a hybrid approach. Private business nodes hold the sensitive data while the validation and verification of transactions for the business are decentralized within seconds and secured to public blockchains within 10 minutes to 2 hours. Nodes could potentially be controlled by owners of individual subreddits for more organic decentralization.
  • Billing is currently centralized - there is a path to federation and decentralization of a scaled billing solution.
  • Operational multi-cloud
  • Operational on-premises capabilities
  • Operational deployment to any datacenter
  • Over 700 independent Community Verification Nodes with proof of ownership
  • Operational Interchain (Interoperable to Bitcoin, Ethereum, and Ethereum Classic, open to more)

Usability

Scaling solutions should have a simple end user experience.

Users shouldn't have to maintain any extra state/proofs, regularly monitor activity, keep track of extra keys, or sign anything other than their normal transactions
Dragonchain and its customers have demonstrated extraordinary usability as a feature in many applications, where users do not need to know that the system is backed by a live blockchain. Lyceum is one of these examples, where the progress of academy courses is being tracked, and successful completion of courses is rewarded with certificates on chain. Our @Save_The_Tweet bot is popular on Twitter. When used with one of the following hashtags - #please, #blockchain, #ThankYou, or #eternalize the tweet is saved through Eternal to multiple blockchains. A proof report is available for future reference. Other examples in use are DEN, our decentralized social media platform, and our console, where users can track their node rewards, view their TIME, and operate a business node.
Examples:

Transactions complete in a reasonable amount of time (seconds or minutes, not hours or days)
All transactions are immediately usable on chain by the system. A transaction begins the path to decentralization at the conclusion of a 5-second block when it gets distributed across 5 separate community run nodes. Full decentralization occurs within 10 minutes to 2 hours depending on which interchain (Bitcoin, Ethereum, or Ethereum Classic) the transaction hits first. Within approximately 2 hours, the combined hash power of all interchained blockchains secures the transaction.

Free to use for end users (no gas fees, or fixed/minimal fees that Reddit can pay on their behalf)
With transaction pricing as low as $0.0000025 per transaction, it may be considered reasonable for Reddit to cover transaction fees for users.
All of Reddit's Transactions on Blockchain (month)
Community points can be earned by users and distributed directly to their Reddit account in batch (as per Reddit's minting plan), and users can withdraw rewards to their Ethereum wallet whenever they wish. Withdrawal fees can be paid by either the user or Reddit. This model has been operating inside the Dragonchain system since 2018, and many security and financial compliance features can be optionally added. We feel that this capability greatly enhances the user experience because it is seamless to a regular user without cryptocurrency experience, yet flexible for a tech-savvy user. With regard to currency or token transactions, these would occur on the Reddit network, verified to BTC and ETH, and would incur the $0.0000025 transaction fee. To estimate this fee, we use the monthly active Reddit user count (per Statista) with a 60% adoption rate and an estimated 10 transactions per user per month, resulting in an approximate $720 cost across the system. Reddit could feasibly incur all associated internal network charges (mining/minting, transfer, burn) as these are very low and controllable fees.
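The ~$720 estimate implies a model along these lines (the MAU figure below is an assumption back-solved to reproduce the stated result; the actual Statista number may differ):

```python
# Assumed inputs (hypothetical, chosen to reproduce the ~$720 estimate)
monthly_active_users = 48_000_000
adoption_rate = 0.60
txns_per_user_per_month = 10
fee_per_txn = 0.0000025  # USD, lowest Dragonchain fee tier

monthly_txns = monthly_active_users * adoption_rate * txns_per_user_per_month
monthly_cost = monthly_txns * fee_per_txn

print(f"{monthly_txns:,.0f} txns/month -> ${monthly_cost:,.2f}/month")
# 288,000,000 txns/month -> $720.00/month
```

Because the per-transaction fee is deterministic, the estimate scales linearly with whichever adoption and usage assumptions are plugged in.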
Reddit Internal Token Transaction Fees

Reddit Ethereum Token Transaction Fees
When we consider further the Ethereum fees that might be incurred, we have a few choices for a solution.
  1. Offload all Ethereum transaction fees (user withdrawals) to interested users as they wish to withdraw tokens for external use or sale.
  2. Cover Ethereum transaction fees by aggregating them on a timed schedule. Users would request withdrawal (from Reddit or individual subreddits), and they would be transacted on the Ethereum network every hour (or some other schedule).
  3. In a combination of the above, customers could cover aggregated fees.
  4. Integrate with alternate Ethereum roll up solutions or other proposals to aggregate minting and distribution transactions onto Ethereum.

Bonus Points

Users should be able to view their balances & transactions via a blockchain explorer-style interface
From interfaces for users who have no knowledge of blockchain technology to users who are well versed in blockchain terms such as those present in a typical block explorer, a system powered by Dragonchain has flexibility on how to provide balances and transaction data to users. Transactions can be made viewable in an Eternal Proof Report, which displays raw data along with TIME staking information and traceability all the way to Bitcoin, Ethereum, and every other Interchained network. The report shows fields such as transaction ID, timestamp, block ID, multiple verifications, and Interchain proof. See example here.
Node payouts within the Dragonchain console are listed in chronological order and can be further seen in either Dragons or USD. See example here.
In our social media platform, Dragon Den, users can see, in real-time, their NRG and MTR balances. See example here.
A new influencer app powered by Dragonchain, Raiinmaker, breaks down data into a user friendly interface that shows coin portfolio, redeemed rewards, and social scores per campaign. See example here.

Exiting is fast & simple
Withdrawing funds via Dragonchain's console requires three clicks; however, withdrawal flows with more enhanced security features, at Reddit's discretion, are also obtainable.

Interoperability

Compatibility with third party apps (wallets/contracts/etc) is necessary.
Proven interoperability at scale that surpasses the required specifications. Our entire platform consists of interoperable blockchains connected to each other and traditional systems. APIs are well documented. Third party permissions are possible with a simple smart contract without the end user being aware. No need to learn any specialized proprietary language. Any code base (not subsets) is usable within a Docker container. Interoperable with any blockchain or traditional APIs. We’ve witnessed relatively complex systems built by engineers with no blockchain or cryptocurrency experience. We’ve also demonstrated the creation of smart contracts within minutes built with BASH shell and Node.js. Please see our source code and API documentation.

Scaling solutions should be extensible and allow third parties to build on top of it
Open source and extensible.
APIs should be well documented and stable

Documentation should be clear and complete
For full documentation, explore our docs, SDKs, GitHub repos, architecture documents, original Disney documentation, and other links or resources provided in this proposal.

Third-party permissionless integrations should be possible & straightforward
Smart contracts are Docker based, can be written in any language, use the full language (not subsets), and can therefore be integrated with any system, including traditional system APIs. Simple is better. Learning an uncommon or proprietary language should not be necessary.
Advanced knowledge of mathematics, cryptography, or L2 scaling should not be required. Compatibility with common utilities & toolchains is expected.
Dragonchain business nodes and smart contracts leverage Docker to allow the use of literally any language or executable code. No proprietary language is necessary. We’ve witnessed relatively complex systems built by engineers with no blockchain or cryptocurrency experience. We’ve also demonstrated the creation of smart contracts within minutes built with BASH shell and Node.js.
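To make the "any language in a container" point concrete, here is a minimal sketch (all names hypothetical) of a contract as nothing more than a function from (state, payload) to new state — the shape of thing any containerized executable, in any language, could provide:

```python
import json

def counter_contract(state: dict, payload: dict) -> dict:
    """A trivial 'counter' contract: a pure function from the current state
    and the call payload to the next state. No special language required."""
    new_state = dict(state)
    new_state["count"] = new_state.get("count", 0) + payload.get("increment", 1)
    return new_state

# Apply two contract invocations in sequence, as a node would.
state: dict = {}
for call in [{"increment": 2}, {"increment": 3}]:
    state = counter_contract(state, call)

print(json.dumps(state))  # the resulting ledgered state
```

The same logic could just as easily be a BASH script or Node.js program reading JSON from stdin inside its Docker container, which is the point being made above.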

Bonus

Bonus Points: Show us how it works. Do you have an idea for a cool new use case for Community Points? Build it!

TIME

Community points could be awarded to Reddit users based upon TIME too, whereas the longer someone is part of a subreddit, the more community points someone naturally gained, even if not actively commenting or sharing new posts. A daily login could be required for these community points to be credited. This grants awards to readers too and incentivizes readers to create an account on Reddit if they browse the website often. This concept could also be leveraged to provide some level of reputation based upon duration and consistency of contribution to a community subreddit.
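A minimal sketch of this accrual idea (the formula and names are hypothetical, not part of any proposal spec): one unit of points per membership day, credited only on days the user actually logged in.

```python
def time_based_points(days_member: int, login_days: set, daily_rate: int = 1) -> int:
    """Hypothetical TIME-style accrual: points accrue per membership day,
    but are only credited on days the user logged in (the daily-login
    requirement suggested above)."""
    return sum(daily_rate for day in range(days_member) if day in login_days)

# A user who has been a member for 10 days and logged in on 7 of them:
points = time_based_points(10, {0, 1, 2, 4, 5, 8, 9})
```

A longer, more consistent login history would then directly translate into a higher reputation score for the community.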

Dragon Den

Dragonchain has already built a social media platform that harnesses community involvement. Dragon Den is a decentralized community built on the Dragonchain blockchain platform. Dragon Den is Dragonchain’s answer to fake news, trolling, and censorship. It incentivizes the creation and evaluation of quality content within communities. It could be described as being a shareholder of a subreddit or Reddit in its entirety. The more your subreddit is thriving, the more rewarding it will be. Den is currently in a public beta and in active development, though the real token economy is not live yet. There are different tokens for various purposes. Two tokens are Lair Ownership Rights (LOR) and Lair Ownership Tokens (LOT). LOT is a non-fungible token for ownership of a specific Lair. LOT will only be created and converted from LOR.
Energy (NRG) and Matter (MTR) work jointly. Your MTR determines how much NRG you receive in a 24-hour period. Providing quality content, or evaluating content will earn MTR.
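The actual Den formula isn't spelled out here, but the relationship described above can be sketched as a hypothetical allocation in which your MTR balance scales your daily NRG allowance:

```python
def daily_nrg(mtr_balance: float, base_allowance: float = 10.0, rate: float = 0.1) -> float:
    """Hypothetical allocation (not Den's real formula): NRG received per
    24-hour period is a base allowance plus a share proportional to MTR held."""
    return base_allowance + rate * mtr_balance
```

Under this sketch, earning MTR by providing or evaluating quality content directly increases how much NRG you receive the next day.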

Security. Users have full ownership & control of their points.
All community points, awarded based upon any type of activity or gift, are secured and provable on all Interchain networks (currently BTC, ETH, ETC). Users are free to spend and withdraw their points as they please, depending on the features Reddit wants to bring into production.

Balances and transactions cannot be forged, manipulated, or blocked by Reddit or anyone else
Users can withdraw their balance to their ERC20 wallet, directly through Reddit. Reddit can cover the fees on their behalf, or the user covers this with a portion of their balance.

Users should own their points and be able to get on-chain ERC20 tokens without permission from anyone else
Through our console users can withdraw their ERC20 rewards. This can be achieved on Reddit too. Here is a walkthrough of our console; though it does not show the quick withdrawal functionality, a user can withdraw at any time. https://www.youtube.com/watch?v=aNlTMxnfVHw

Points should be recoverable to on-chain ERC20 tokens even if all third-parties involved go offline
If necessary, signed transactions from the Reddit system (e.g. Reddit + Subreddit) can be sent to the Ethereum smart contract for minting.
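A sketch of the co-signing idea (all names hypothetical): the token contract mints only when every required party's signature over the same minting message verifies. On Ethereum this would be ECDSA checked inside the contract; HMAC stands in here to keep the example dependency-free.

```python
import hmac
import hashlib

def sign(secret: bytes, message: bytes) -> str:
    """Produce a stand-in 'signature' for a message with a party's key."""
    return hmac.new(secret, message, hashlib.sha256).hexdigest()

def can_mint(message: bytes, signatures: dict, signer_keys: dict) -> bool:
    """Mint only if all required signers (e.g. Reddit + Subreddit) have
    validly signed this exact minting message."""
    return all(
        hmac.compare_digest(signatures.get(party, ""), sign(key, message))
        for party, key in signer_keys.items()
    )

signer_keys = {"reddit": b"reddit-secret", "subreddit": b"sub-secret"}
mint_msg = b"mint 100 points to 0xUserAddress"
signatures = {party: sign(key, mint_msg) for party, key in signer_keys.items()}
```

Because the contract only needs valid signatures, minting remains possible even if the rest of the off-chain system is unavailable, which is the recoverability property being claimed.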

A public, third-party review attesting to the soundness of the design should be available
To our knowledge, at least two large corporations, including a top 3 accounting firm, have conducted positive reviews. These reviews have never been made public, as Dragonchain did not pay or contract for these studies to be released.

Bonus points
Public, third-party implementation review available or in progress
See above

Compatibility with HSMs & hardware wallets
For the purpose of this proposal, all tokenization would be on the Ethereum network using standard token contracts and as such, would be able to leverage all hardware wallet and Ethereum ecosystem services.

Other Considerations

Minting/distributing tokens is not performed by Reddit directly
This operation can be automated by a smart contract on Ethereum. Subreddits can, if desired, have a role to play.

One off point burning, as well as recurring, non-interactive point burning (for subreddit memberships) should be possible and scalable
This is possible and scalable with interaction between the Dragonchain Reddit system and the Ethereum token contract(s).
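One non-interactive billing pass for recurring subreddit memberships might look like the following sketch (names hypothetical): deduct the membership price from each subscriber who can afford it, and let the rest lapse.

```python
def recurring_burn(balances: dict, subscribers: list, price: int):
    """One scheduled, non-interactive burn pass over all subscribers.
    Returns the updated balances and the list of still-active members."""
    new_balances = dict(balances)
    still_active = []
    for user in subscribers:
        if new_balances.get(user, 0) >= price:
            new_balances[user] -= price   # burn the membership fee
            still_active.append(user)
        # users who cannot afford the fee simply lapse; nothing is deducted
    return new_balances, still_active

balances, active = recurring_burn({"alice": 10, "bob": 3}, ["alice", "bob"], 5)
```

Because each pass is a pure function of balances and the subscriber list, it batches naturally and scales with the number of subscribers rather than requiring per-user interaction.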

Fully open-source solutions are strongly preferred
Dragonchain is fully open source (see section on Disney release after conclusion).

Conclusion

Whether it is today, or in the future, we would like to work together to bring secure flexibility to the highest standards. It is our hope to be considered by Ethereum, Reddit, and other integrative solutions so we may further discuss the possibilities of implementation. In our public demonstration, 256 million transactions were handled in our operational network on chain in 24 hours, for the low cost of $25K, which if run today would cost $625. Dragonchain’s interoperable foundation provides the atmosphere necessary to implement a frictionless community points system. Thank you for your consideration of our proposal. We look forward to working with the community to make something great!

Disney Releases Blockchain Platform as Open Source

The team at Disney created the Disney Private Blockchain Platform. The system was a hybrid interoperable blockchain platform for ledgering and smart contract development geared toward solving problems with blockchain adoption and usability. All objective evaluation would consider the team’s output a success. We released a list of use cases that we explored in some capacity at Disney, and our input on blockchain standardization as part of our participation in the W3C Blockchain Community Group.
https://lists.w3.org/Archives/Public/public-blockchain/2016May/0052.html

Open Source

In 2016, Roets proposed to release the platform as open source to spread the technology outside of Disney, as others within the W3C group were interested in the solutions that had been created inside of Disney.
Following a long process, step by step, the team met requirements for release. Among the requirements, the team had to:
  • Obtain VP support and approval for the release
  • Verify ownership of the software to be released
  • Verify that no proprietary content would be released
  • Convince the organization that there was a value to the open source community
  • Convince the organization that there was a value to Disney
  • Offer the plan for ongoing maintenance of the project outside of Disney
  • Itemize competing projects
  • Verify no conflict of interest
  • Select a preferred license
  • Change the project name to not use the name Disney, any Disney character, or any other associated IP - proposed Dragonchain - approved
  • Obtain legal approval
  • Approval from corporate, parks, and other business units
  • Approval from multiple Disney patent groups
  • Copyright holder defined by Disney (Disney Connected and Advanced Technologies)
  • Trademark searches conducted for the selected name Dragonchain
  • Obtain IT security approval
  • Manual review of OSS components conducted
  • OWASP Dependency and Vulnerability Check Conducted
  • Obtain technical (software) approval
  • Offer management, process, and financial plans for the maintenance of the project.
  • Meet list of items to be addressed before release
  • Remove all Disney project references and scripts
  • Create a public distribution list for email communications
  • Remove Roets’ direct and internal contact information
  • Create public Slack channel and move from Disney slack channels
  • Create proper labels for issue tracking
  • Rename internal private Github repository
  • Add informative description to Github page
  • Expand README.md with more specific information
  • Add information beyond current “Blockchains are Magic”
  • Add getting started sections and info on cloning/forking the project
  • Add installation details
  • Add uninstall process
  • Add unit, functional, and integration test information
  • Detail how to contribute and get involved
  • Describe the git workflow that the project will use
  • Move to public, non-Disney git repository (Github or Bitbucket)
  • Obtain Disney Open Source Committee approval for release
On top of meeting the above criteria, as part of the process, the maintainer of the project had to receive the codebase on their own personal email and create accounts for maintenance (e.g. Github) with non-Disney accounts. Given the fact that the project spanned multiple business units, Roets was individually responsible for its ongoing maintenance. Because of this, he proposed in the open source application to create a non-profit organization to hold the IP and maintain the project. This was approved by Disney.
The Disney Open Source Committee approved the application known as OSSRELEASE-10, and the code was released on October 2, 2016. Disney decided to not issue a press release.
Original OSSRELEASE-10 document

Dragonchain Foundation

The Dragonchain Foundation was created on January 17, 2017. https://den.social/l/Dragonchain/24130078352e485d96d2125082151cf0/dragonchain-and-disney/
submitted by j0j0r0 to ethereum [link] [comments]

Why I’m bullish on Zilliqa (long read)

Edit: TL;DR added in the comments
 
Hey all, I've been researching coins since 2017 and have gone through 100s of them in the last 3 years. I got introduced to blockchain via Bitcoin of course, analyzed Ethereum thereafter, and from that moment I have had a keen interest in smart contract platforms. I’m passionate about Ethereum but I find Zilliqa to have a better risk-reward ratio, especially because Zilliqa has found an elegant balance between being secure, decentralized and scalable in my opinion.
 
Below I post my analysis of why from all the coins I went through I’m most bullish on Zilliqa (yes I went through Tezos, EOS, NEO, VeChain, Harmony, Algorand, Cardano etc.). Note that this is not investment advice and although it's a thorough analysis there is obviously some bias involved. Looking forward to what you all think!
 
Fun fact: the name Zilliqa is a play on ‘silica’ (silicon dioxide), as in “silicon for the high-throughput consensus computer.”
 
This post is divided into (i) Technology, (ii) Business & Partnerships, and (iii) Marketing & Community. I’ve tried to make the technology part readable for a broad audience. If you’ve ever tried understanding the inner workings of Bitcoin and Ethereum you should be able to grasp most parts. Otherwise, just skim through and once you are zoning out head to the next part.
 
Technology and some more:
 
Introduction
 
The technology is one of the main reasons why I’m so bullish on Zilliqa. First thing you see on their website is: “Zilliqa is a high-performance, high-security blockchain platform for enterprises and next-generation applications.” These are some bold statements.
 
Before we deep dive into the technology let’s take a step back in time first as they have quite the history. The initial research paper from which Zilliqa originated dates back to August 2016: Elastico: A Secure Sharding Protocol For Open Blockchains where Loi Luu (Kyber Network) is one of the co-authors. Other ideas that led to the development of what Zilliqa has become today are: Bitcoin-NG, collective signing CoSi, ByzCoin and Omniledger.
 
The technical white paper was made public in August 2017 and since then they have achieved everything stated in it, and have also created their own open-source, intermediate-level smart contract language called Scilla (a functional programming language similar to OCaml).
 
Mainnet has been live since the end of January 2019 with daily transaction rates growing continuously. About a week ago mainnet reached 5 million transactions and 500,000+ addresses in total, along with 2,400 nodes keeping the network decentralized and secure. Circulating supply is nearing 11 billion and currently only mining rewards are left. The maximum supply is 21 billion, with annual inflation currently at 7.13% and set only to decrease with time.
 
Zilliqa realized early on that the usage of public cryptocurrencies and smart contracts was increasing but decentralized, secure, and scalable alternatives were lacking in the crypto space. They proposed to apply sharding to a public smart contract blockchain where the transaction rate increases almost linearly with the number of nodes. More nodes = higher transaction throughput and increased decentralization. Sharding comes in many forms and Zilliqa uses network, transaction, and computational sharding. Network sharding opens up the possibility of using transaction and computational sharding on top. Zilliqa does not use state sharding for now. We’ll come back to this later.
 
Before we continue dissecting how Zilliqa achieves this from a technological standpoint, it’s good to keep in mind that making a blockchain decentralized, secure, and scalable at the same time is still one of the main hurdles to widespread usage of decentralized networks. In my opinion this needs to be solved before blockchains can get to the point where they can create and add large-scale value. So I invite you to read the next section to grasp the underlying fundamentals. Because after all, these premises need to be true, otherwise there isn’t a fundamental case to be bullish on Zilliqa, right?
 
Down the rabbit hole
 
How have they achieved this? Let’s define the basics first: key players on Zilliqa are the users and the miners. A user is anybody who uses the blockchain to transfer funds or run smart contracts. Miners are the (shard) nodes in the network who run the consensus protocol and get rewarded for their service in Zillings (ZIL). The mining network is divided into several smaller networks called shards, which is also referred to as ‘network sharding’. Miners subsequently are randomly assigned to a shard by another set of miners called DS (Directory Service) nodes. The regular shards process transactions and the outputs of these shards are eventually combined by the DS shard as they reach consensus on the final state. More on how these DS shards reach consensus (via pBFT) will be explained later on.
 
The Zilliqa network produces two types of blocks: DS blocks and Tx blocks. One DS Block consists of 100 Tx Blocks. And as previously mentioned there are two types of nodes concerned with reaching consensus: shard nodes and DS nodes. Becoming a shard node or DS node is defined by the result of a PoW cycle (Ethash) at the beginning of the DS Block. All candidate mining nodes compete with each other and run the PoW (Proof-of-Work) cycle for 60 seconds, and the submissions achieving the highest difficulty are allowed onto the network. To put it in perspective: the average difficulty for one DS node is ~2 Th/s, equaling 2,000,000 Mh/s or 55 thousand+ GeForce GTX 1070 / 8 GB GPUs at 35.4 Mh/s. Each DS Block, 10 new DS nodes are admitted. A shard node needs to provide around 8.53 GH/s currently (around 240 GTX 1070s). Dual mining ETH/ETC and ZIL is possible and can be done via mining software such as Phoenix and Claymore. There are pools, and if you have large amounts of hashing power (Ethash) available you could mine solo.
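The GPU-count figures above follow directly from the quoted hashrates; the arithmetic can be checked as:

```python
GTX_1070_MHS = 35.4            # Mh/s for one GeForce GTX 1070, as quoted above

ds_node_mhs = 2_000_000.0      # ~2 Th/s average DS-node difficulty
shard_node_mhs = 8_530.0       # ~8.53 GH/s required per shard node

ds_gpus = ds_node_mhs / GTX_1070_MHS        # about 56,500 -> "55 thousand+"
shard_gpus = shard_node_mhs / GTX_1070_MHS  # about 241   -> "around 240"
```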
 
The PoW cycle of 60 seconds is a peak performance and acts as an entry ticket to the network. The entry ticket is called a sybil resistance mechanism and makes it incredibly hard for adversaries to spawn lots of identities and manipulate the network with them. This PoW process repeats after every 100 Tx Blocks, which corresponds to roughly 1.5 hours. In between, no PoW needs to be done, meaning Zilliqa’s energy consumption to keep the network secure is low. For more detailed information on how mining works click here.
Okay, hats off to you. You have made it this far. Before we go any deeper down the rabbit hole we first must understand why Zilliqa goes through all of the above technicalities and understand a bit more what a blockchain on a more fundamental level is. Because the core of Zilliqa’s consensus protocol relies on the usage of pBFT (practical Byzantine Fault Tolerance) we need to know more about state machines and their function. Navigate to Viewblock, a Zilliqa block explorer, and just come back to this article. We will use this site to navigate through a few concepts.
 
We have established that Zilliqa is a public and distributed blockchain. Meaning that everyone with an internet connection can send ZILs, trigger smart contracts, etc. and there is no central authority who fully controls the network. Zilliqa and other public and distributed blockchains (like Bitcoin and Ethereum) can also be defined as state machines.
 
Taking the liberty of paraphrasing examples and definitions given by Samuel Brooks’ medium article, he describes the definition of a blockchain (like Zilliqa) as: “A peer-to-peer, append-only datastore that uses consensus to synchronize cryptographically-secure data”.
 
Next, he states that: "blockchains are fundamentally systems for managing valid state transitions”. For some more context, I recommend reading the whole medium article to get a better grasp of the definitions and understanding of state machines. Nevertheless, let’s try to simplify and compile it into a single paragraph. Take traffic lights as an example: all its states (red, amber, and green) are predefined, all possible outcomes are known and it doesn’t matter if you encounter the traffic light today or tomorrow. It will still behave the same. Managing the states of a traffic light can be done by triggering a sensor on the road or pushing a button resulting in one traffic lights’ state going from green to red (via amber) and another light from red to green.
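The traffic light example is exactly a finite state machine, which can be written down in a few lines (a toy sketch, not anything blockchain-specific):

```python
# The traffic light as a finite state machine: all states predefined,
# transitions deterministic, behaviour identical today and tomorrow.
TRANSITIONS = {"green": "amber", "amber": "red", "red": "green"}

def step(state: str) -> str:
    """Advance the light by one state (e.g. triggered by a road sensor)."""
    return TRANSITIONS[state]

state = "green"
state = step(step(state))  # green -> amber -> red
```

A blockchain is the same idea with an unbounded state space: transactions are the triggers, and consensus decides which state transition everyone accepts as valid.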
 
With public blockchains like Zilliqa, this isn’t so straightforward and simple. It started with block #1 almost 1.5 years ago and every 45 seconds or so a new block linked to the previous one is added, resulting in a chain of blocks with transactions that everyone can verify from block #1 to the current block #647,000+. The state is ever-changing and the states it can find itself in are infinite. And while a traffic light might work in tandem with various other traffic lights, it’s rather insignificant compared to a public blockchain. Because Zilliqa consists of 2,400 nodes that need to work together to achieve consensus on the latest valid state, while some of these nodes may have latency or broadcast issues, drop offline, or deliberately try to attack the network, etc.
 
Now go back to the Viewblock page, take a look at the number of transactions and addresses and at the block and DS heights, and then hit refresh. As expected, you see new incremented values on one or all parameters. And how did the Zilliqa blockchain manage to transition from the previous valid state to the latest valid state? By using pBFT to reach consensus on the latest valid state.
 
After having obtained the entry ticket, miners execute pBFT to reach consensus on the ever-changing state of the blockchain. pBFT requires a series of network communication between nodes, and as such there is no GPU involved (but CPU). Resulting in the total energy consumed to keep the blockchain secure, decentralized and scalable being low.
 
pBFT stands for practical Byzantine Fault Tolerance and is an optimization on the Byzantine Fault Tolerant algorithm. To quote Blockonomi: “In the context of distributed systems, Byzantine Fault Tolerance is the ability of a distributed computer network to function as desired and correctly reach a sufficient consensus despite malicious components (nodes) of the system failing or propagating incorrect information to other peers.” Zilliqa is such a distributed computer network and depends on the honesty of the nodes (shard and DS) to reach consensus and to continuously update the state with the latest block. If pBFT is a new term for you I can highly recommend the Blockonomi article.
 
The idea of pBFT was introduced in 1999 - one of the authors even won a Turing award for it - and it is well researched and applied in various blockchains and distributed systems nowadays. If you want more advanced information than the Blockonomi link provides click here. And if you’re in between Blockonomi and the University of Singapore read the Zilliqa Design Story Part 2 dating from October 2017.
Quoting from the Zilliqa tech whitepaper: “pBFT relies upon a correct leader (which is randomly selected) to begin each phase and proceed when the sufficient majority exists. In case the leader is byzantine it can stall the entire consensus protocol. To address this challenge, pBFT offers a view change protocol to replace the byzantine leader with another one.”
 
pBFT can tolerate up to ⅓ of the nodes being dishonest (offline counts as Byzantine, i.e. dishonest) and the consensus protocol will function without stalling or hiccups. Once more than ⅓ but no more than ⅔ of the nodes are dishonest, the network will stall and a view change will be triggered to elect a new DS leader. Only when more than ⅔ of the nodes are dishonest (66%+) do double-spend attacks become possible.
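These three regimes can be sketched as a small classifier (an illustration of the thresholds, not Zilliqa's implementation):

```python
def pbft_status(total_nodes: int, dishonest: int) -> str:
    """Classify network health by the share of dishonest (Byzantine) nodes:
    up to 1/3 keeps consensus live, between 1/3 and 2/3 stalls the network
    (triggering a view change), beyond 2/3 double-spends become possible."""
    if 3 * dishonest <= total_nodes:
        return "live"
    if 3 * dishonest <= 2 * total_nodes:
        return "stalled: view change"
    return "compromised"
```

For a 600-node DS committee, up to 200 dishonest nodes leave consensus running, and an attacker would need more than 400 to threaten double-spends.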
 
If the network stalls no transactions can be processed and one has to wait until a new honest leader has been elected. When the mainnet was just launched and in its early phases, view changes happened regularly. As of today the last stalling of the network - and view change being triggered - was at the end of October 2019.
 
Another benefit of using pBFT for consensus, besides low energy use, is the immediate finality it provides. Once your transaction is included in a block and the block is added to the chain, it’s done. Lastly, take a look at this article where three types of finality are defined: probabilistic, absolute and economic finality. Zilliqa falls under absolute finality (just like Tendermint, for example). Although lengthy already, we only skimmed through some of the inner workings of Zilliqa’s consensus: read the Zilliqa Design Story Part 3 and you will be close to having a complete picture of it. Enough about PoW, the sybil resistance mechanism, pBFT, etc. Another thing we haven’t looked at yet is the amount of decentralization.
 
Decentralisation
 
Currently, there are four shards, each consisting of 600 nodes: 1 shard with 600 so-called DS nodes (Directory Service; they need to achieve a higher difficulty than shard nodes) and 1,800 shard nodes, of which 250 are shard guards (centralized nodes controlled by the team). The number of shard guards has steadily declined from 1,200 in January 2019 to 250 as of May 2020. On the Viewblock statistics you can see that many of the nodes are located in the US, but those are only the (CPU parts of the) shard nodes that perform pBFT. There is no data on where the PoW sources are coming from. And when the Zilliqa blockchain starts reaching its transaction capacity limit, a network upgrade needs to be executed to lift the current cap of 2,400 nodes, allowing more nodes and the formation of more shards, which will let the network keep scaling according to demand.
Besides shard nodes there are also seed nodes. The main role of seed nodes is to serve as direct access points (for end-users and clients) to the core Zilliqa network that validates transactions. Seed nodes consolidate transaction requests and forward these to the lookup nodes (another type of nodes) for distribution to the shards in the network. Seed nodes also maintain the entire transaction history and the global state of the blockchain which is needed to provide services such as block explorers. Seed nodes in the Zilliqa network are comparable to Infura on Ethereum.
 
The seed nodes were first operated only by Zilliqa themselves, exchanges and Viewblock. Operators of seed nodes like exchanges had no incentive to open them to the greater public, so they were centralised at first. Decentralisation at the seed node level has been steadily rolled out since March 2020 (Zilliqa Improvement Proposal 3). Currently the number of seed nodes is being increased, they are public-facing, and at the same time PoS is applied to incentivize seed node operators and make it possible for ZIL holders to stake and earn passive yields. Important distinction: seed nodes are not involved with consensus! That is still PoW as entry ticket and pBFT for the actual consensus.
 
5% of the block rewards are being assigned to seed nodes (from the beginning in 2019) and those are being used to pay out ZIL stakers. The 5% block rewards with an annual yield of 10.03% translate to roughly 610 MM ZILs in total that can be staked. Exchanges use the custodial variant of staking and wallets like Moonlet will use the non-custodial version (starting in Q3 2020). Staking is being done by sending ZILs to a smart contract created by Zilliqa and audited by Quantstamp.
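The ~610 MM figure follows from the two percentages above, given an assumed annual block-reward total (the annual reward figure below is an inferred assumption, not stated in the text):

```python
seed_node_share = 0.05           # 5% of block rewards go to seed node staking
annual_yield = 0.1003            # 10.03% advertised annual staking yield
annual_block_rewards = 1.224e9   # assumed annual ZIL block rewards (inferred)

# The stakeable pool is whatever principal the 5% reward stream
# can pay a 10.03% yield on:
stakeable_zil = annual_block_rewards * seed_node_share / annual_yield  # ~610 MM
```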
 
With a high number of DS and shard nodes, and seed nodes becoming more decentralized too, Zilliqa qualifies for the label of decentralized in my opinion.
 
Smart contracts
 
Let me start by saying I’m not a developer and my programming skills are quite limited. So I’m taking the ELI5 route (maybe 12), but if you are familiar with Javascript, Solidity or specifically OCaml please head straight to Scilla - read the docs to get a good initial grasp of how Zilliqa’s smart contract language Scilla works, and if you ask yourself “why another programming language?” check this article. And if you want to play around with some sample contracts in an IDE click here. The faucet can be found here. And more information on architecture, dapp development and API can be found on the Developer Portal.
If you are more into listening and watching: check this recent webinar explaining Zilliqa and Scilla. Link is time-stamped so you’ll start right away with a platform introduction, roadmap 2020 and afterwards a proper Scilla introduction.
 
Generalized: programming languages can be divided into ‘object-oriented’ or ‘functional’. Here is an ELI5 given by a software development academy: “all programs have two basic components: data – what the program knows – and behavior – what the program can do with that data. So object-oriented programming states that combining data and related behaviors in one place, called an ‘object’, makes it easier to understand how a particular program works. On the other hand, functional programming argues that data and behavior are different things and should be separated to ensure their clarity.”
 
Scilla is on the functional side and shares similarities with OCaml: OCaml is a general-purpose programming language with an emphasis on expressiveness and safety. It has an advanced type system that helps catch your mistakes without getting in your way. It's used in environments where a single mistake can cost millions and speed matters, is supported by an active community, and has a rich set of libraries and development tools. For all its power, OCaml is also pretty simple, which is one reason it's often used as a teaching language.
 
Scilla is blockchain agnostic, can be implemented on other blockchains as well, is recognized by academics, and won a Distinguished Artifact Award at the end of last year.
 
One of the reasons why the Zilliqa team decided to create their own programming language focused on preventing smart contract vulnerabilities is that adding logic on a blockchain, programming, means that you cannot afford to make mistakes. Otherwise, it could cost you. It’s all great and fun blockchains being immutable but updating your code because you found a bug isn’t the same as with a regular web application for example. And with smart contracts, it inherently involves cryptocurrencies in some form thus value.
 
Another difference with programming languages on a blockchain is gas. Every transaction you do on a smart contract platform like Zilliqa or Ethereum costs gas. With gas you basically pay for computational costs. Sending a ZIL from address A to address B costs 0.001 ZIL currently. Smart contracts are more complex, often involve various functions and require more gas (if gas is a new concept, click here).
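The fee model is just price times units consumed. In this sketch only the 0.001 ZIL total for a plain transfer comes from the text; the split into a gas price and gas units is hypothetical:

```python
def txn_fee(gas_price_zil: float, gas_units: int) -> float:
    """Fee = gas price x gas consumed. The numbers below are illustrative:
    only the 0.001 ZIL transfer total is from the text above."""
    return gas_price_zil * gas_units

transfer_fee = txn_fee(0.00002, 50)     # plain A -> B payment
contract_fee = txn_fee(0.00002, 5_000)  # a more complex contract call
```

A contract call consuming 100x the gas units of a transfer costs 100x as much at the same gas price, which is why resource analysis of contract code matters.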
 
So with Scilla, similar to Solidity, you need to make sure that “every function in your smart contract will run as expected without hitting gas limits. An improper resource analysis may lead to situations where funds may get stuck simply because a part of the smart contract code cannot be executed due to gas limits. Such constraints are not present in traditional software systems”. Scilla design story part 1
 
Some examples of smart contract issues you’d want to avoid are: leaking funds, ‘unexpected changes to critical state variables’ (example: someone other than you setting his or her address as the owner of the smart contract after creation) or simply killing a contract.
 
Scilla also allows for formal verification. Wikipedia to the rescue: In the context of hardware and software systems, formal verification is the act of proving or disproving the correctness of intended algorithms underlying a system with respect to a certain formal specification or property, using formal methods of mathematics.
 
Formal verification can be helpful in proving the correctness of systems such as: cryptographic protocols, combinational circuits, digital circuits with internal memory, and software expressed as source code.
 
Scilla is being developed hand-in-hand with the formalization of its semantics and its embedding into the Coq proof assistant — a state-of-the-art tool for mechanized proofs about properties of programs.
 
Simply put, with Scilla and accompanying tooling developers can be mathematically sure and proof that the smart contract they’ve written does what he or she intends it to do.
 
Smart contract on a sharded environment and state sharding
 
There is one more topic I’d like to touch on: smart contract execution in a sharded environment (and what is the effect of state sharding). This is a complex topic. I’m not able to explain it any easier than what is posted here. But I will try to compress the post into something easy to digest.
 
Earlier on we have established that Zilliqa can process transactions in parallel due to network sharding. This is where the linear scalability comes from. We can define simple transactions: a transaction from address A to B (Category 1), a transaction where a user interacts with one smart contract (Category 2), and the most complex ones, where triggering a transaction results in multiple smart contracts being involved (Category 3). The shards are able to process transactions on their own without interference from the other shards. With Category 1 transactions that is doable; with Category 2 transactions, sometimes, if the address is in the same shard as the smart contract; but with Category 3 you definitely need communication between the shards. Solving that requires defining a set of communication rules the protocol must follow in order to process all transactions in a generalised fashion.
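The three categories and the cross-shard question can be sketched as follows (the address-to-shard mapping and transaction shape are illustrative, not Zilliqa's wire format):

```python
def category(txn: dict) -> int:
    """Classify per the three categories above: 1 = plain transfer,
    2 = one contract involved, 3 = multiple contracts involved."""
    n_contracts = len(txn.get("contracts", []))
    return 1 if n_contracts == 0 else 2 if n_contracts == 1 else 3

def needs_cross_shard(txn: dict, shard_of: dict) -> bool:
    """A shard can settle the transaction alone only if every involved
    party lives in the same shard."""
    parties = [txn["sender"]] + txn.get("contracts", [])
    return len({shard_of[p] for p in parties}) > 1

shard_of = {"A": 0, "B": 0, "C1": 0, "C2": 1}
t1 = {"sender": "A", "contracts": []}            # Category 1
t2 = {"sender": "A", "contracts": ["C1"]}        # Category 2, same shard
t3 = {"sender": "A", "contracts": ["C1", "C2"]}  # Category 3, spans shards
```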
 
And this is where the downside of state sharding currently comes in. All shards in Zilliqa have access to the complete state: yes, the state (0.1 GB at the moment) keeps growing and every node has to store it, but it also means shards don't need to shop around for information held on other shards, which would require more communication and add more complexity. Links if you want to dig further (computer science and/or developer knowledge required): Scilla - language grammar, Scilla - Foundations for Verifiable Decentralised Computations on a Blockchain, Gas Accounting, and NUS x Zilliqa: Smart contract language workshop.
 
Easier-to-follow links on programming Scilla: https://learnscilla.com/home and Ivan on Tech.
 
Roadmap / Zilliqa 2.0
 
There is no strictly defined roadmap, but here are the topics being worked on. Via the Zilliqa website there is also more information on the projects they are working on.
 
Business & Partnerships
 
It’s not only in technology that Zilliqa seems to be excelling: its ecosystem has been expanding and is starting to grow rapidly. The project is on a mission to provide OpenFinance (OpFi) to the world, and Singapore is the right place to be due to its progressive regulations and futuristic thinking. Singapore has taken a proactive approach towards cryptocurrencies by introducing the Payment Services Act 2019 (PS Act). Among other things, the PS Act will regulate intermediaries dealing with certain cryptocurrencies, with a particular focus on consumer protection and anti-money laundering. It will also provide a stable regulatory licensing and operating framework for cryptocurrency entities, effectively covering all crypto businesses and exchanges based in Singapore. According to PwC, 82% of the surveyed executives in Singapore reported blockchain initiatives underway, and 13% of them have already brought those initiatives live to the market. There is also a growing list of organizations starting to provide digital payment services. Moreover, Singaporean blockchain developer Building Cities Beyond has recently created a $15 million innovation grant to encourage development on its ecosystem. All of this suggests that Singapore is trying to position itself as (one of) the leading blockchain hubs in the world.
 
Zilliqa already seems to be taking advantage of this and recently helped launch Hg Exchange on its platform, together with the financial institutions PhillipCapital, PrimePartners and Fundnel. Hg Exchange, which is now approved by the Monetary Authority of Singapore (MAS), uses smart contracts to represent digital assets. Through Hg Exchange, financial institutions worldwide can use Zilliqa's safe-by-design smart contracts to enable the trading of private equities. For example, think of companies such as Grab, Airbnb and SpaceX that are not available for public trading right now. Hg Exchange will allow investors to buy shares of private companies and unicorns and capture their value before an IPO. Anquan, the main company behind Zilliqa, has also recently announced that it became a partner and shareholder in TEN31 Bank, a fully regulated bank that allows for the tokenization of assets and aims to bridge the gap between conventional banking and the blockchain world. If STOs, the tokenization of assets, and equity trading continue to increase, then Zilliqa’s public blockchain would be an ideal candidate due to its strategic positioning, partnerships, regulatory compliance and the technology being built on top of it.
 
What is also very encouraging is their focus on banking the un(der)banked. They are launching a stablecoin basket, starting with XSGD. As many of you know, stablecoins are currently mostly used for trading; however, Zilliqa is actively trying to broaden their use case. I recommend everybody read this text that Amrit Kumar (one of the co-founders) wrote. These stablecoins will be integrated into the traditional markets and bridge the gap between the crypto world and the traditional world. This could potentially revolutionize and legitimise the crypto space if retailers and companies, for example, start to use stablecoins for payments or remittances, instead of them solely being used for trading.
 
Zilliqa also released their DeFi strategic roadmap (dated November 2019), which seems to align well with their OpFi strategy. A non-custodial DEX built by Switcheo is coming to Zilliqa, allowing cross-chain trading (atomic swaps) between ETH, EOS and ZIL based tokens. They also signed a Memorandum of Understanding for a (soon to be announced) USD stablecoin, and as Zilliqa is all about regulations and being compliant, I’m speculating it will be a regulated USD stablecoin. Furthermore, XSGD is already created and visible on the block explorer, and XIDR (an Indonesian stablecoin) is also coming soon via StraitsX. Here is also an overview of the Tech Stack for Financial Applications from September 2019. Further quoting Amrit Kumar on this:
 
“There are two basic building blocks in DeFi/OpFi though: 1) stablecoins as you need a non-volatile currency to get access to this market and 2) a dex to be able to trade all these financial assets. The rest are built on top of these blocks.
 
So far, together with our partners and community, we have worked on developing these building blocks with XSGD as a stablecoin. We are working on bringing a USD-backed stablecoin as well. We will soon have a decentralised exchange developed by Switcheo. And with HGX going live, we are also venturing into the tokenization space. More to come in the future.”
 
Additionally, they also have the ZILHive initiative that injects capital into projects. There have already been 6 waves of various teams working on infrastructure, innovation and research, and they are not just from ASEAN or Singapore but global: see Grantees breakdown by country. Over 60 project teams from over 20 countries have contributed to Zilliqa's ecosystem. This includes individuals and teams developing wallets, explorers, developer toolkits, smart contract testing frameworks, dapps, etc. As some of you may know, Unstoppable Domains (UD) blew up when they launched on Zilliqa. UD aims to replace cryptocurrency addresses with a human-readable name and allows for uncensorable websites. Zilliqa will probably be the only one able to handle all these transactions on-chain due to its ability to scale and the resulting low fees, which is why the UD team launched on Zilliqa in the first place. Furthermore, Zilliqa also has a strong emphasis on security, compliance, and privacy, which is why they partnered with companies like Elliptic, ChainSecurity (part of PwC Switzerland), and Incognito. Their sister company Aqilliz (Zilliqa spelled backwards) focuses on revolutionizing the digital advertising space and is doing interesting things like using Zilliqa to track outdoor digital ads with companies like Foodpanda.
 
Zilliqa is listed on nearly all major exchanges, has several different fiat gateways, and has recently been added to Binance’s margin trading and futures trading with really good volume. They also have a very impressive team with good credentials and experience. They don't just have “tech people”: they have a mix of tech people, business people, marketeers, scientists, and more. Naturally, it's good to have a mix of people with different skill sets if you work in the crypto space.
 
Marketing & Community
 
Zilliqa has a very strong community. If you just follow their Twitter, their engagement is much higher than you would expect for a coin with approximately 80k followers. They have also been ‘coin of the day’ on LunarCrush many times. LunarCrush tracks real-time cryptocurrency value and social data, and according to that data, Zilliqa seems to have a more fundamental and deeper understanding of marketing and community engagement than almost all other coins. While almost all coins have been a bit frozen in the last months, Zilliqa seems to be on its own bull run: it was somewhere in the 100s a few months ago and is currently ranked #46 on CoinGecko. Their official Telegram also has over 20k people and is very active, and their community channel, which is over 7k now, is more active and larger than many other official channels. Their local communities also seem to be growing.
 
Moreover, their community started ‘Zillacracy’ together with the Zilliqa core team (see www.zillacracy.com). It’s a community-run initiative where people from all over the world are now helping with marketing and development on Zilliqa. Since its launch in February 2020 they have been doing a lot, and they will also run their own non-custodial seed node for staking. This seed node will allow them to start generating revenue and become a self-sustaining entity that could potentially scale up into a decentralized company working in parallel with the Zilliqa core team. Compared to all the other smart contract platforms (e.g. Cardano, EOS, Tezos, etc.), none of them seem to have started a similar initiative (correct me if I’m wrong though). This suggests, in my opinion, that these other smart contract platforms do not fully understand how to utilize the ‘power of the community’. This is something you cannot ‘buy with money’, and it gives many projects in the space a disadvantage.
 
Zilliqa also released two social products called SocialPay and Zeeves. SocialPay allows users to earn ZILs while tweeting with a specific hashtag. They have recently used it in partnership with the Singapore Red Cross for a marketing campaign after their initial pilot program. It seems like a very valuable social product with a good use case. I can see a lot of traditional companies entering the space through this product, which they seem to suggest will happen. Tokenizing hashtags with smart contracts to get network effect is a very smart and innovative idea.
 
Regarding Zeeves: this is a tipping bot for Telegram. They already have thousands of signups and plan to keep upgrading it for more and more people to use (e.g. they recently added a quiz feature). They also use it during AMAs to reward people in real time. It’s a very smart approach to grow their communities and get people familiar with ZIL, and I can see this becoming very big on Telegram. This tool suggests, again, that the Zilliqa team has a deeper understanding of what the crypto space and community need and is good at finding the right innovative tools to grow and scale.
 
To be honest, I haven’t covered everything (I’m also reaching the character limit haha). So many updates have been happening lately that it's hard to keep up, such as the International Monetary Fund mentioning Zilliqa in their report, custodial and non-custodial staking, Binance Margin, Futures, the Widget, entering the Indian market, and more. The Head of Marketing, Colin Miles, has also released this as an overview of what is coming next. And last but not least, Vitalik Buterin has been mentioning Zilliqa lately, acknowledging the project and noting that both projects have a lot of room to grow. There is much more info of course, and a good part of it has been served to you on a silver platter. I invite you to continue researching by yourself :-) And if you have any comments or questions, please post here!
submitted by haveyouheardaboutit to CryptoCurrency [link] [comments]

Flatten the Curve. #49. Let's Dig into Jade Helm. AI. The Surveillance State. Internet of Things. FISA. Pentagon Preparing for Mass Civil Breakdown. What is Mob Excess Deterrent Using Silent Audio? Stay Aware and Get Ahead of the Curve.

Flatten the Curve. Part 48. Source Here
It's getting crazier day by day now, so are you following the Boy Scout motto?
On this topic, Baden-Powell says: Remember your motto, "Be Prepared." Be prepared for accidents by learning beforehand what you ought to do in the different kinds that are likely to occur. Be prepared to do that thing the moment the accident does occur. In Scouting for Boys, Baden-Powell wrote that to Be Prepared means “you are always in a state of readiness in mind and body to do your duty.”
Why should you be prepared? Because TPTB have been preparing, that’s why.
June 12, 2014: The Guardian • Pentagon preparing for mass civil breakdown. Social science is being militarised to develop 'operational tools' to target peaceful activists and protest movements Source Here
Pentagon preparing for mass civil breakdown. It seemed ludicrous back in 2014, didn't it? Inconceivable. Sure, some preppers believed it, but they're always getting ready and nothing happened. Doomsday was always right around the corner, and then the next corner, and on and on. Televangelists have probably accused more politicians of being the antichrist than the number of politicians who went to Epstein's island.
But why would they be preparing for mass civil breakdown? Could it be the same reason the military is preparing for war, droughts and famines brought about by environmental collapse?
February 20, 2020: History Network • Here’s Why These Six Ancient Civilizations Mysteriously Collapsed. From the Maya to Greenland’s Vikings, check out six civilizations that seemingly disappeared without a trace. Source Here
All of these civilizations vanished because of some combination of exhausting their natural resources, drought, plague, and the Little Ice Age. Sound familiar? Don't tell me that the Rockefeller Foundation and BlackRock became environmentally aware out of a sense of obligation to the planet. They're setting the groundwork for what's coming down the pipe. This isn't about money anymore, this is about control and survival. Throw out the rulebook because the rules no longer apply.
Do you think the surveillance system is for your protection, or the protection of the state? Don't you think that an era of upcoming calamities will severely damage the communication networks, and thus the surveillance system? It might be prudent to consider that Starlink is being established to make the system redundant, so that they never lose track of the precious worker bees before they can be connected to the AI hive mind, right Elon? Neuralink, don't leave home without it.
But let's not forget about the wonderful world of the Internet of Things.
March 15, 2012 • More and more personal and household devices are connecting to the internet, from your television to your car navigation systems to your light switches. CIA Director David Petraeus cannot wait to spy on you through them. Earlier this month, Petraeus mused about the emergence of an "Internet of Things" -- that is, wired devices -- at a summit for In-Q-Tel, the CIA's venture capital firm. "'Transformational' is an overused word, but I do believe it properly applies to these technologies," Petraeus enthused, "particularly to their effect on clandestine tradecraft." All those new online devices are a treasure trove of data if you're a "person of interest" to the spy community. Once upon a time, spies had to place a bug in your chandelier to hear your conversation. With the rise of the "smart home," you'd be sending tagged, geolocated data that a spy agency can intercept in real time when you use the lighting app on your phone to adjust your living room's ambiance. "Items of interest will be located, identified, monitored, and remotely controlled through technologies such as radio-frequency identification, sensor networks, tiny embedded servers, and energy harvesters -- all connected to the next-generation internet using abundant, low-cost, and high-power computing," Petraeus said, "the latter now going to cloud computing, in many areas greater and greater supercomputing, and, ultimately, heading to quantum computing." Petraeus allowed that these household spy devices "change our notions of secrecy" and prompt a rethink of "our notions of identity and secrecy." All of which is true -- if convenient for a CIA director. The CIA has a lot of legal restrictions against spying on American citizens. But collecting ambient geolocation data from devices is a grayer area, especially after the 2008 carve-outs to the Foreign Intelligence Surveillance Act. 
Hardware manufacturers, it turns out, store a trove of geolocation data; and some legislators have grown alarmed at how easy it is for the government to track you through your phone or PlayStation. That's not the only data exploit intriguing Petraeus. He's interested in creating new online identities for his undercover spies -- and sweeping away the "digital footprints" of agents who suddenly need to vanish. "Proud parents document the arrival and growth of their future CIA officer in all forms of social media that the world can access for decades to come," Petraeus observed. "Moreover, we have to figure out how to create the digital footprint for new identities for some officers." Source Here
December 19, 2019: New York Times • THE DATA REVIEWED BY TIMES OPINION didn’t come from a telecom or giant tech company, nor did it come from a governmental surveillance operation. It originated from a location data company, one of dozens quietly collecting precise movements using software slipped onto mobile phone apps. You’ve probably never heard of most of the companies — and yet to anyone who has access to this data, your life is an open book. They can see the places you go every moment of the day, whom you meet with or spend the night with, where you pray, whether you visit a methadone clinic, a psychiatrist’s office or a massage parlor. The Times and other news organizations have reported on smartphone tracking in the past. But never with a data set so large. Even still, this file represents just a small slice of what’s collected and sold every day by the location tracking industry — surveillance so omnipresent in our digital lives that it now seems impossible for anyone to avoid. It doesn’t take much imagination to conjure the powers such always-on surveillance can provide an authoritarian regime like China’s. Within America’s own representative democracy, citizens would surely rise up in outrage if the government attempted to mandate that every person above the age of 12 carry a tracking device that revealed their location 24 hours a day. Yet, in the decade since Apple’s App Store was created, Americans have, app by app, consented to just such a system run by private companies. Now, as the decade ends, tens of millions of Americans, including many children, find themselves carrying spies in their pockets during the day and leaving them beside their beds at night — even though the corporations that control their data are far less accountable than the government would be. Source Here
The IoT should be renamed the IoTT (Internet of Tracking Things), shouldn't it? But we can't have people figure out what's really happening, can we? It's a good thing that quantum computing isn't too close, isn’t it?
April 5, 2018: Global News • (Project Maven) Over 3,000 Google employees have a signed a petition in protest against the company’s involvement with a U.S. Department of Defense artificial intelligence (AI) project that studies imagery and could eventually be used to improve drone strikes in the battlefield. Source Here
December 12, 2019 • Palantir took over Project Maven defense contract after Google backed out. Source Here
December 29, 2020: Input • Palantir exec says its work is on par with the Manhattan Project. Comparing AI to the most lethal weapon in human history isn’t comforting. Source Here
August 14, 2020: VentureBeat • Google researchers use quantum computing to help improve image classification. Source Here
Hmmm. Maybe Apple will be for the little guy? They have always valued privacy rights, right?
October 2, 2013: Vice News • The hacktivist group Anonymous released a video statement with an accompanying Pastebin document claiming that there are definitive links between AuthenTec, the company that developed the iPhone 5S’s fingerprint scanner, and the US government. Source Here
An apple a day helps the NSA. Or Google. Or Microsoft. Or Amazon. Take your pick from the basket, because dem Apple's are all the same. But at least we have fundamental rights, right?
Foreign agent declaration not required • No mention of foreign agent status is made in the Protect America Act of 2007. Under prior FISA rules, persons targeted for surveillance must have been declared as foreign agents before a FISA warrant would be accorded by the FISC court.
'Quasi-anti-terrorism law' for all-forms of intelligence collection • Vastly marketed by U.S. federal and military agencies as a law to prevent terror attacks, the Protect America Act was actually a law focused on the 'acquisition' of desired intelligence information, of unspecified nature. The sole requirement is geolocation outside the United States at time of Directive invocation; pursuant to Authorization or Order invocation, surveillance Directives can be undertaken towards persons targeted for intelligence information gathering. Implementation of Directives can take place inside the United States or outside the United States. No criminal or terrorism investigation of the person need be in play at time of the Directive. All that need be required is that the target be related to an official desire for intelligence information gathering for actions on part of persons involved in surveillance to be granted full immunity from U.S. criminal or civil procedures, under Section 105B(l) of the Act.
Removal of FISA Strictures from warrant authorization; warrants not required • But the most striking aspect of the Protect America Act was the notation that any information gathering did not comprise electronic surveillance. This wording had the effect of removing FISA-related strictures from Protect America Act 2007-related Directives, serving to remove a number of protections for persons targeted, and requirements for persons working for U.S. intelligence agencies.
The acquisition does not constitute electronic surveillance • The removal of the term electronic surveillance from any Protect America Act Directive implied that the FISC court approval was no longer required, as FISA warrants were no longer required. In the place of a warrant was a certification, made by U.S. intelligence officers, which was copied to the Court. In effect, the FISC became less of a court than a registry of pre-approved certifications. Certifications (in place of FISA warrants) were able to be levied ex post facto, in writing to the Court no more than 72 hours after it was made. The Attorney General was to transmit as soon as possible to the Court a sealed copy of the certification that would remain sealed unless the certification was needed to determine the legality of the acquisition. Source Here
Oh. FISA is basically a rubber stamp. And even if the stage play wasn't pretending to follow the script, would it matter? Who could actually stop it at this point? The cat's out of the bag and Pandora's Box is open.
Controversial debates arose as the Protect America Act was published. Constitutional lawyers and civil liberties experts expressed concerns that this Act authorized massive, wide-ranging information gathering with no oversight. Whereas it placed much focus on communications, the Act allowed for information gathering of all shapes and forms. The ACLU called it the "Police America Act" – "authorized a massive surveillance dragnet", calling the blank-check oversight provisions "meaningless," and calling them a "phony court review of secret procedures."
So the surveillance state doesn't have checks and balances anymore. The state is preparing for Massive Civil Breakdown. They keep warning us about environmental collapse. Got it? Good. Let's keep on keeping on.
The District of Columbia Organic Act of 1871 created a single new district corporation governing the entire federal territory, called the District of Columbia, thus dissolving the three major political subdivisions of the District (Port of Georgetown, the City of Washington, and Washington County) and their governments. Source Here
The first big leap in corporate personhood from holding mere property and contract rights to possessing more expansive rights was a claim that the Equal Protection Clause applied to corporations. One of the strangest twists in American constitutional law was the moment that corporations gained personhood under the Equal Protection Clause of the Fourteenth Amendment. It occurred in a case called Santa Clara County, and what was odd was that the Supreme Court did not really even decide the matter in the actual opinion. It only appeared in a footnote to the case. What we are likely to have at the conclusion of the Supreme Court term is corporations that are empowered to spend in American elections because of Bellotti and Citizens United; corporations that can make religious objections thanks to Hobby Lobby; and if Jesner turns out as badly as I predict, corporations will be able to aid and abet human rights violations abroad with impunity. Source Here
"Having a corporation would allow people to put property into a collective ownership that could be held with perpetual existence," she says. "So it wouldn't be tied to any one person's lifespan, or subject necessarily to laws regarding inheriting property." Later on, in the United States and elsewhere, the advantages of incorporation were essential to efficient and secure economic development. Unlike partnerships, the corporation continued to exist even if a partner died; there was no unanimity required to do something; shareholders could not be sued individually, only the corporation as a whole, so investors only risked as much as they put into buying shares. Source Here
The way that the Arab Bank may get away with this alleged morally troubling behavior, even though it has a New York branch, is by reasserting the basic argument that was made in Nestle USA and Kiobel II: that the federal Alien Tort Statute was not intended to apply to corporations full stop. Given other cases in this area like Mohamad v. PLO, which held the word “individual” in the Torture Victim Protection Act means a natural person and does not impose any liability against organizations, the Arab Bank’s procorporate argument may well prevail. There are multiple federal Circuit Courts which have shot down the argument that corporations are immune from suit under the Alien Tort Statute. The lone outlier is the Second Circuit, which decided in 2010 that corporations are excused from suit in Kiobel I. This is the case that was appealed to the Supreme Court and became Kiobel II. Jesner v. Arab Bank was litigated in the Second Circuit. One question in Jesner was what exactly did Kiobel II do to Kiobel I. So far in the litigation, Jesner concluded that Kiobel I and its conclusion that corporations can’t be sued in federal court using the Alien Tort Statute remained the controlling law of the Second Circuit.
There's a reason people call lawyers snakes: it's because most of them speak with forked tongues. So the corporation isn't being held liable, but the shareholders can't be held liable either. That's too insane to even be called a Catch-22. We are literally being set up to have no recourse because there isn’t anybody who can be held responsible. Why is that important when I've been talking about the surveillance state?
July 14, 2020: The Intercept • Microsoft’s police surveillance services are often opaque because the company sells little in the way of its own policing products. It instead offers an array of “general purpose” Azure cloud services, such as machine learning and predictive analytics tools like Power BI (business intelligence) and Cognitive Services, which can be used by law enforcement agencies and surveillance vendors to build their own software or solutions. A rich array of Microsoft’s cloud-based offerings is on full display with a concept called “The Connected Officer.” Microsoft situates this concept as part of the Internet of Things, or IoT, in which gadgets are connected to online servers and thus made more useful. “The Connected Officer,” Microsoft has written, will “bring IoT to policing.” With the Internet of Things, physical objects are assigned unique identifiers and transfer data over networks in an automated fashion. If a police officer draws a gun from its holster, for example, a notification can be sent over the network to alert other officers there may be danger. Real Time Crime Centers could then locate the officer on a map and monitor the situation from a command and control center. Source Here
Uhm, I guess it really is all connected, isn’t it?
June 18, 2020: The Guardian • How Target, Google, Bank of America and Microsoft quietly fund police through private donations. More than 25 large corporations in the past three years have contributed funding to private police foundations, new report says. Source Here
Long live the Military Industrial Techno Surveillance State. If you have nothing to hide, then you have nothing to worry about. Really? Are we still believing that line? Cause it's a load of crap. If we have nothing to worry about, then why are they worried enough to be implementing surveillance systems with corresponding units on the ground? Got your attention there, didn't I?
August 19, 2019: Big Think • Though the term "Orwellian" easily applies to such a technology, Michel's illuminating reporting touches something deeper. Numerous American cities have already been surveilled using these god-like cameras, including Gorgon Stare, a camera-enabled drone that can track individuals over a 50-square kilometer radius from 20,000 feet. Here's the real rub: the feature that allows users to pinch and zoom on Instagram is similar to what WAMI allows. Anything within those 50-square kilometers is now under the microscope. If this sounds like some futuristic tech, think again: Derivations of this camera system have been tested in numerous American cities. Say there is a big public protest. With this camera you can follow thousands of protesters back to their homes. Now you have a list of the home addresses of all the people involved in a political movement. If on their way home you witness them committing some crime—breaking a traffic regulation or frequenting a location that is known to be involved in the drug trade—you can use that surveillance data against them to essentially shut them up. That's why we have laws that prevent the use of surveillance technologies because it is human instinct to abuse them. That's why we need controls. Source Here
Want to know more about the Gorgon Stare? Flatten the Curve. Part 12. Source Here
Now, I'm not sure if you remember or know any Greek mythology, but the Gorgons were three sisters, and one sister, Medusa, had snakes on her head (she wasn't a lawyer) and turned people to stone when she looked at them.
MEDUSA (Mob Excess Deterrent Using Silent Audio) is a directed-energy non-lethal weapon designed by WaveBand Corporation in 2003-2004 for temporary personnel incapacitation. The weapon is based on the microwave auditory effect resulting in a strong sound sensation in the human head when it is subject to certain kinds of pulsed/modulated microwave radiation. The developers claimed that through the combination of pulse parameters and pulse power, it is possible to raise the auditory sensation to a “discomfort” level, deterring personnel from entering a protected perimeter or, if necessary, temporarily incapacitating particular individuals. In 2005, Sierra Nevada Corporation acquired WaveBand Corporation.
Ok. Get it? The Gorgon eye in the sky stares at you while the Medusa makes you immobile. Not good, but at least it'll just freeze you in your tracks.
July 6, 2008: Gizmodo • The Sierra Nevada Corporation claimed this week that it is ready to begin production on the MEDUSA, a damned scary ray gun that uses the "microwave audio effect" to implant sounds and perhaps even specific messages inside people's heads. Short for Mob Excess Deterrent Using Silent Audio, MEDUSA creates the audio effect with short microwave pulses. The pulses create a shockwave inside the skull that's detected by the ears, and basically makes you think you're going balls-to-the-wall batshit insane. Source Here
Uhm. And drive you insane.
July 26, 2008: Gizmodo • The MEDUSA crowd control ray gun we reported on earlier this month sounded like some pretty amazing-and downright scary-technology. Using the microwave auditory effect, the beam, in theory, would have put sounds and voice-like noises in your head, thereby driving you away from the area. Crowd control via voices in your head. Sounds cool. However, it turns out that the beam would actually kill you before any of that happy stuff started taking place, most likely by frying or cooking your brain inside your skull. Can you imagine if this thing made it out into the field? Awkward! Source Here
Annnnnnnndddddd it'll kill you.
Guys, they're prepared. They've been prepared. They're ready. Remember the doomsday bunkers? The military moving into Cheyenne Mountain? Deep underground military bunkers? The rapid rollout of 5G? BITCOIN and UBI so neatly inserted into our minds over the last five years? They've directly told us to have three months of supplies in our homes. 2020 isn't an anomaly; it's the start of the collapse of our natural resources. Take a look on Reddit at all the posts about crazy weather. Cyanobacteria blooms killing dogs and people. Toxic super-pollution caused by atmospheric inversions killing people. This isn't normal; this is the New Normal. And they know it. They've known it for a while. Let me show you one last thing before I wrap it up.
From the earliest Chinese dynasties to the present, the jade deposits most used were not only those of Khotan in the Western Chinese province of Xinjiang but other parts of China as well, such as Lantian, Shaanxi.
Remember, words matter. Look at Gorgon Stare and Medusa. They don't randomly grab names out of a hat, or pick them because they think it sounds dystopian. They pick words for a reason.
July 7, 2017: The Warzone • There appears to be only one official news story on this exercise at all, and it's available on the website of Air Mobility Command's Eighteenth Air Force, situated at Joint Base Charleston. At the time of writing, a Google search shows that there were more than a half dozen copies on other Air Force pages, as well as a number of photographs. For some reason, someone appears to have taken these offline or otherwise broken all the links. Using Google to search the Defense Video Imagery Distribution System, the U.S. military's main public affairs hub, brings up more broken links. Oh, and unless there's been some sort of mistake, JADE HELM actually stands for the amazingly obtuse Joint Assistance for Deployment Execution Homeland Eradication of Local Militants. A separate web search for this phrase does not turn up any other results. Source Here
Now, using an acronym that indicates training to Eradicate Local Militants seems pretty dumb. It may be used in that manner if environmental collapse triggers riots, but I don't think they would warn everyone ahead of time, do you? So I dug a little bit more.
Joint Assistant for Development and Execution (JADE) is a U.S. military system used for planning the deployment of military forces in crisis situations. The U.S. military developed this automated planning software system in order to expedite the creation of the detailed planning needed to deploy military forces for a military operation. JADE uses Artificial Intelligence (AI) technology combining user input, a knowledge base of stored plans, and suggestions by the system to provide the ability to develop large-scale and complex plans in minimal time. JADE is a knowledge-based system that uses highly structured information that takes advantage of data hierarchies. An official 2016 document approved for public release titled Human Systems Roadmap Review describes plans to create autonomous weapon systems that analyze social media and make decisions, including the use of lethal force, with minimal human involvement. This type of system is referred to as a Lethal Autonomous Weapon System (LAWS). The name "JADE" comes from the jade green color seen on the island of Oahu in Hawaii where the U.S. Pacific Command (PACOM) is headquartered.
PACOM? Isn't that the command group responsible for the South China Sea?
Formerly known as United States Pacific Command (USPACOM) since its inception, the command was renamed to U.S. Indo-Pacific Command on 30 May 2018, in recognition of the greater emphasis on South Asia, especially India.
Now doesn't it look like Jade Helm is preparing for an invasion? And possibly insurrection later, or at the same time? Or riots over WW3? Or food riots? Start thinking about why the laws are starting to exclude corporations. Then think about the mercenaries being contracted out by the government.
October 17, 2018: The Carolinan • In 2016, 75 percent of American forces were private contractors. In 2017, Erik Prince, former head of Blackwater, and Stephen Feinberg, head of Dyncorp, discussed plans for contractors completely taking over U.S. operations in Afghanistan. Although ultimately unsuccessful, it remains to be seen if the current administration will change its mind. Contractors are involved in almost every military task, such as intelligence analysis, logistics and training allied soldiers. Contractors are even involved in U.S. special ops missions. This is because contractors are essentially untraceable and unaccountable. Most are born in other countries; only 33 percent are registered U.S. citizens. Private military firms don’t have to report their actions to Congress, unlike the military or intelligence agencies. They also aren’t subject to the Freedom of Information Act, so private citizens and journalists aren’t allowed to access their internal documents. There are also no international laws to regulate private military firms. It’s been proven that many contractors are involved in illegal activities. The larger multinational companies sometimes hire local subcontractors. These contractors sometimes aren’t background-checked. A 2010 investigation by the Senate found that many subcontractors were linked to murders, kidnappings, bribery and anti-coalition activities. Some subcontractors even formed their own unlicensed mercenary groups after coalition forces leave. A 2010 House investigation showed evidence that the Department of Defense had hired local warlords for security services. In 2007, Blackwater contractors massacred 17 civilians. This eventually led Blackwater to being restructured and renamed as Academi. Source Here
Military Exercises. Private Defense Firms. No oversight. And it's all coming soon. Read more at Flatten the Curve. Part 20. Upcoming war and catastrophes. Source Here
Nah. I'm just fear mongering and Doomscrolling again.
Heads up and eyes open. Talk soon.
submitted by biggreekgeek to conspiracy [link] [comments]

A new whitepaper analysing the performance and scalability of the Streamr pub/sub messaging Network is now available. Take a look at some of the fascinating key results in this introductory blog


Streamr Network: Performance and Scalability Whitepaper


The Corea milestone of the Streamr Network went live in late 2019. Since then a few people in the team have been working on an academic whitepaper to describe its design principles, position it with respect to prior art, and prove certain properties it has. The paper is now ready, and it has been submitted to the IEEE Access journal for peer review. It is also now published on the new Papers section on the project website. In this blog, I’ll introduce the paper and explain its key results. All the figures presented in this post are from the paper.
The reasons for doing this research and writing this paper were simple: many prospective users of the Network, especially more serious ones such as enterprises, ask questions like ‘how does it scale?’, ‘why does it scale?’, ‘what is the latency in the network?’, and ‘how much bandwidth is consumed?’. While some answers could be provided before, the Network in its currently deployed form is still small-scale and can’t yet show a track record of scalability, for example, so there was clearly a need to produce some in-depth material about the structure of the Network and its performance at large, global scale. The paper answers these questions.
Another reason is that decentralized peer-to-peer networks have experienced a new renaissance due to the rise in blockchain networks. Peer-to-peer pub/sub networks were a hot research topic in the early 2000s, but not many real-world implementations were ever created. Today, most blockchain networks use methods from that era under the hood to disseminate block headers, transactions, and other events important for them to function. Other megatrends like IoT and social media are also creating demand for new kinds of scalable message transport layers.

The latency vs. bandwidth tradeoff

The current Streamr Network uses regular random graphs as stream topologies. ‘Regular’ here means that nodes connect to a fixed number of other nodes that publish or subscribe to the same stream, and ‘random’ means that those nodes are selected randomly.
Random connections can of course mean that absurd routes get formed occasionally, for example a data point might travel from Germany to France via the US. But random graphs have been studied extensively in the academic literature, and their properties are not nearly as bad as the above example sounds — such graphs are actually quite good! Data always takes multiple routes in the network, and only the fastest route counts. The less-than-optimal routes are there for redundancy, and redundancy is good, because it improves security and churn tolerance.
There is an important parameter called node degree, which is the fixed number of nodes to which each node in a topology connects. A higher node degree means more duplication and thus more bandwidth consumption for each node, but it also means that fast routes are more likely to form. It’s a tradeoff; better latency can be traded for worse bandwidth consumption. In the following section, we’ll go deeper into analyzing this relationship.

Network diameter scales logarithmically

One useful metric for estimating the behavior of latency is the network diameter, which is the number of hops on the shortest path between the most distant pair of nodes in the network (i.e. the “longest shortest path”). The below plot shows how the network diameter behaves depending on node degree and number of nodes.

Network diameter
We can see that the network diameter increases logarithmically (very slowly), and a higher node degree ‘flattens the curve’. This is a property of random regular graphs, and this is very good — growing from 10,000 nodes to 100,000 nodes only increases the diameter by a few hops! To analyse the effect of the node degree further, we can plot the maximum network diameter using various node degrees:
Network diameter in network of 100 000 nodes
We can see that there are diminishing returns for increasing the node degree. On the other hand, the penalty (the number of duplicates, i.e. bandwidth consumption) increases linearly with node degree:

Number of duplicates received by the non-publisher nodes
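To make the tradeoff concrete, here is a minimal, illustrative sketch (not Streamr's implementation): it builds random d-regular graphs with a configuration-model construction and measures their diameter with breadth-first search, so you can watch the diameter grow only slowly as the network quadruples in size.

```python
import random
from collections import deque

def random_regular_graph(n, d, rng):
    """Configuration-model construction; retry until the graph is simple."""
    while True:
        stubs = [v for v in range(n) for _ in range(d)]
        rng.shuffle(stubs)
        edges = set()
        for u, v in zip(stubs[::2], stubs[1::2]):
            if u == v or (min(u, v), max(u, v)) in edges:
                break  # self-loop or parallel edge: reject and retry
            edges.add((min(u, v), max(u, v)))
        else:
            adj = {v: [] for v in range(n)}
            for u, v in edges:
                adj[u].append(v)
                adj[v].append(u)
            return adj

def diameter(adj):
    """Longest shortest path over all node pairs, via BFS from every node."""
    worst = 0
    for src in adj:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        worst = max(worst, max(dist.values()))
    return worst

rng = random.Random(1)
for n in (64, 256, 1024):
    print(n, diameter(random_regular_graph(n, 4, rng)))
```

Each quadrupling of the node count typically adds only a hop or two to the diameter, which is the "flattening the curve" effect in the plots; meanwhile each node still receives at most `d` copies of a message (so `d - 1` duplicates), which is why the bandwidth penalty grows linearly with degree.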
In the Streamr Network, each stream forms its own separate overlay network and can even have a custom node degree. This allows the owner of the stream to configure their preferred latency/bandwidth balance (imagine such a slider control in the Streamr Core UI). However, finding a good default value is important. From this analysis, we can conclude that:
  • The logarithmic behavior of network diameter leads us to hope that latency might behave logarithmically too, but since the number of hops is not the same as latency (in milliseconds), the scalability needs to be confirmed in the real world (see next section).
  • A node degree of 4 yields good latency/bandwidth balance, and we have selected this as the default value in the Streamr Network. This value is also used in all the real-world experiments described in the next section.
It’s worth noting that in such a network, the bandwidth requirement for publishers is determined by the node degree and not the number of subscribers. With a node degree 4 and a million subscribers, the publisher only uploads 4 copies of a data point, and the million subscribing nodes share the work of distributing the message among themselves. In contrast, a centralized data broker would need to push out a million copies.
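A back-of-the-envelope sketch of this point; the helper and the numbers are purely illustrative, not from the paper:

```python
def publisher_copies(subscribers, node_degree=None):
    """Copies of one message the origin must upload itself.

    Centralized broker: one copy per subscriber. Streamr-style overlay:
    only `node_degree` copies; subscribers relay the rest among themselves.
    """
    return node_degree if node_degree is not None else subscribers

assert publisher_copies(1_000_000) == 1_000_000          # centralized broker
assert publisher_copies(1_000_000, node_degree=4) == 4   # p2p overlay
```

The publisher's upload cost is constant in the audience size, which is what makes very large fan-outs feasible without a data center behind the publisher.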

Latency scales logarithmically

To see whether actual latency scales logarithmically in real-world conditions, we ran large numbers of nodes in 16 different Amazon AWS data centers around the world. We ran experiments with network sizes between 32 and 2048 nodes. Each node published messages to the network, and we measured how long it took for the other nodes to receive them. The experiment was repeated 10 times for each network size.
The below image displays one of the key results of the paper. It shows a CDF (cumulative distribution function) of the measured latencies across all experiments. The y-axis runs from 0 to 1, i.e. 0% to 100%.
CDF of message propagation delay
From this graph we can easily read things like: in a 32-node network (blue line), 50% of message deliveries happened within 150 ms globally, and all messages were delivered within around 250 ms. In the largest network of 2048 nodes (pink line), 99% of deliveries happened within 362 ms globally.
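For readers unfamiliar with CDFs, here is a tiny sketch of how such a curve is produced from raw latency samples; the sample values below are made up, not the paper's data:

```python
def empirical_cdf(samples):
    """Return sorted (latency, fraction of deliveries at or below it) pairs."""
    xs = sorted(samples)
    n = len(xs)
    return [(x, (i + 1) / n) for i, x in enumerate(xs)]

latencies_ms = [120, 95, 150, 210, 180, 140, 250, 130]
cdf = empirical_cdf(latencies_ms)
# First latency at which the curve crosses 0.5, i.e. the median delivery time.
median = next(x for x, p in cdf if p >= 0.5)
print(median)  # 140
```

Reading a value off the curve answers questions of the form "within how many milliseconds did X% of deliveries complete?", which is exactly how the 50% and 99% figures above are obtained.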
To put these results in context, PubNub, a centralized message brokering service, promises to deliver messages within 250 ms — and that’s a centralized service! Decentralization comes with unquestionable benefits (no vendor lock-in, no trust required, network effects, etc.), but if such protocols are inferior in terms of performance or cost, they won’t get adopted. It’s pretty safe to say that the Streamr Network is on par with centralized services even when it comes to latency, which is usually the Achilles’ heel of P2P networks (think of how slow blockchains are!). And the Network will only get better with time.
Then we tackled the big question: does the latency behave logarithmically?
Mean message propagation delay in Amazon experiments
Above, the thick line is the average latency for each network size. From the graph, we can see that the latency grows logarithmically as the network size increases, which means excellent scalability.
The shaded area shows the difference between the best and worst average latencies in each repeat. Here we can see the element of chance at play; due to the randomness in which nodes become neighbours, some topologies are faster than others. Given enough repeats, some near-optimal topologies can be found. The difference between average topologies and the best topologies gives us a glimpse of how much room for optimisation there is, i.e. with a smarter-than-random topology construction, how much improvement is possible (while still staying in the realm of regular graphs)? Out of the observed topologies, the difference between the average and the best observed topology is between 5–13%, so not that much. Other subclasses of graphs, such as irregular graphs, trees, and so on, can of course unlock more room for improvement, but they are different beasts and come with their own disadvantages too.
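The logarithmic trend can be checked numerically by fitting latency = a + b·ln(n) with ordinary least squares on ln(n). This is a sketch with illustrative, perfectly log-linear toy data, not the paper's measurements:

```python
import math

def fit_log_model(sizes, latencies):
    """Least-squares fit of latency = a + b * ln(size); returns (a, b)."""
    xs = [math.log(n) for n in sizes]
    mx = sum(xs) / len(xs)
    my = sum(latencies) / len(latencies)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, latencies)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

sizes = [32, 64, 128, 256, 512, 1024, 2048]
means = [150, 165, 180, 195, 210, 225, 240]  # toy data: +15 ms per doubling
a, b = fit_log_model(sizes, means)
# Under this model each doubling of the network adds b * ln(2) milliseconds.
```

If the fitted curve tracks the measured means closely, doubling the network size adds only a constant increment to latency, which is what "scales logarithmically" means in practice.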
It’s also worth asking: how much worse is the measured latency compared to the fastest possible latency, i.e. that of a direct connection? While having direct connections between a publisher and subscribers is definitely not scalable, secure, or often even feasible due to firewalls, NATs and such, it’s still worth asking what the latency penalty of peer-to-peer is.

Relative delay penalty in Amazon experiments
As you can see, this plot has the same shape as the previous one, but the y-axis is different. Here, we are showing the relative delay penalty (RDP). It’s the latency in the peer-to-peer network (shown in the previous plot), divided by the latency of a direct connection measured with the ping tool. So a direct connection equals an RDP value of 1, and the measured RDP in the peer-to-peer network is roughly between 2 and 3 in the observed topologies. It increases logarithmically with network size, just like absolute latency.
Again, given that latency is the Achilles’ heel of decentralized systems, that’s not bad at all. It shows that such a network delivers acceptable performance for the vast majority of use cases, only excluding the most latency-sensitive ones, such as online gaming or arbitrage trading. For most other use cases, it doesn’t matter whether it takes 25 or 75 milliseconds to deliver a data point.
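The RDP computation itself is just a ratio per node pair; a small sketch with made-up (overlay, ping) measurements:

```python
def rdp(p2p_ms, direct_ms):
    """Relative delay penalty: overlay latency over direct-connection latency."""
    return p2p_ms / direct_ms

# Hypothetical paired measurements in milliseconds: (overlay latency, ping).
pairs = [(180, 80), (240, 90), (300, 110)]
penalties = [rdp(p, d) for p, d in pairs]
assert all(1.0 <= r <= 4.0 for r in penalties)  # roughly the 2-3x observed
```

An RDP of 1 would mean the overlay adds no delay at all; values between 2 and 3 mean each message travels a few overlay hops instead of one direct link.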

Latency is predictable

It’s useful for a messaging system to have consistent and predictable latency. Imagine for example a smart traffic system, where cars can alert each other about dangers on the road. It would be pretty bad if, even minutes after publishing it, some cars still haven’t received the warning. However, such delays easily occur in peer-to-peer networks. Everyone in the crypto space has seen first-hand how plenty of Bitcoin or Ethereum nodes lag even minutes behind the latest chain state.
So we wanted to see whether it would be possible to estimate the latencies in the peer-to-peer network if the topology and the latencies between connected pairs of nodes are known. We applied Dijkstra’s algorithm to compute estimates for average latencies from the input topology data, and compared the estimates to the actual measured average latencies:
Mean message propagation delay in Amazon experiments
We can see that, at least in these experiments, the estimates seemed to provide a lower bound for the actual values, and the average estimation error was 3.5%. The measured value is higher than the estimated one because the estimation only considers network delays, while in reality there is also a little bit of a processing delay at each node.
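The estimation method can be sketched as follows, assuming the topology and per-link latencies are known. The toy topology and latency values here are invented for illustration; the publisher is node "A":

```python
import heapq

def dijkstra(adj, src):
    """Shortest-path delays from src; adj maps node -> [(neighbour, ms)]."""
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

links = {
    "A": [("B", 30), ("C", 50)],
    "B": [("A", 30), ("C", 10), ("D", 40)],
    "C": [("A", 50), ("B", 10), ("D", 20)],
    "D": [("B", 40), ("C", 20)],
}
dist = dijkstra(links, "A")
# Estimated mean propagation delay from the publisher to everyone else.
estimate = sum(d for node, d in dist.items() if node != "A") / (len(dist) - 1)
```

As in the paper's comparison, a measured mean would sit slightly above this estimate, because the shortest-path model counts only network delay and ignores per-hop processing time.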

Conclusion

The research has shown that the Streamr Network can be expected to deliver messages in roughly 150–350 milliseconds worldwide, even at a large scale with thousands of nodes subscribing to a stream. This is on par with centralized message brokers today, showing that the decentralized and peer-to-peer approach is a viable alternative for all but the most latency-sensitive applications.
It’s thrilling to think that by accepting a latency only 2–3 times longer than that of an unscalable and insecure direct connection, applications can interconnect over an open fabric with global scalability, no single point of failure, no vendor lock-in, and no need to trust anyone: all of that becomes available out of the box.
In the real-time data space, there are plenty of other aspects to explore, which we didn’t cover in this paper. For example, we did not measure throughput characteristics of network topologies. Different streams are independent, so clearly there’s scalability in the number of streams, and heavy streams can be partitioned, allowing each stream to scale too. Throughput is mainly limited, therefore, by the hardware and network connection used by the network nodes involved in a topology. Measuring the maximum throughput would basically be measuring the hardware as well as the performance of our implemented code. While interesting, this is not a high priority research target at this point in time. And thanks to the redundancy in the network, individual slow nodes do not slow down the whole topology; the data will arrive via faster nodes instead.
Also out of scope for this paper is analysing the costs of running such a network, including the OPEX for publishers and node operators. This is a topic of ongoing research, which we’re currently doing as part of designing the token incentive mechanisms of the Streamr Network, due to be implemented in a later milestone.
I hope that this blog has provided some insight into the fascinating results the team uncovered during this research. For a more in-depth look at the context of this work, and more detail about the research, we invite you to read the full paper.
If you have an interest in network performance and scalability from a developer or enterprise perspective, we will be hosting a talk about this research in the coming weeks, so keep an eye out for more details on the Streamr social media channels. In the meantime, feedback and comments are welcome. Please add a comment to this Reddit thread or email [[email protected]](mailto:[email protected]).
Originally published by Henri at blog.streamr.network on August 24, 2020.
submitted by thamilton5 to streamr [link] [comments]
