🌐
Reddit
reddit.com › r/crypto › how secure is des against non-nsa level adversaries?
r/crypto on Reddit: How secure is DES against non-NSA level ...
September 24, 2016 - A random guy with a desktop PC could crack your message fast enough to cause you problems. Someone with a botnet could crack your message in real time. 3DES is still considered secure, but there's absolutely no reason to use it unless you were using DES before and had to upgrade.
🌐
FasterCapital
fastercapital.com › content › Data-encryption-standard--Crypto-Confidence--Why-DES-Matters-for-Entrepreneurs.html
Data encryption standard: Crypto Confidence: Why DES Matters for ...
DES has been extensively tested and analyzed for its security and performance, and has proven to be resistant to most types of attacks, such as brute-force, differential, and linear cryptanalysis. DES also has a low error rate and a high degree of randomness, which reduces the chances of data corruption or leakage. On the other hand, DES is outdated and vulnerable: it was designed in an era when computing power was limited and data volumes were small. Today, however, DES is considered ...
🌐
Stack Exchange
crypto.stackexchange.com › questions › 16186 › is-tea-considered-secure
cryptanalysis - Is TEA considered secure? - Cryptography Stack ...

Yes, AFAIK the original TEA is safe and generally fine when

  • keys are random (key change should be atomic and with a fresh random key);
  • the 64-bit block size is not a concern (say, an operating mode other than ECB is used and the key is changed before a gigabyte worth of data);
  • the relative slowness is not a show-stopper;
  • side-channels are not an issue, or are taken care of (which is not especially hard; in particular TEA is inherently immune to timing attacks).

I'm not aware of any attack in a random key setup requiring much less work than $2^{126}$ encryptions (as required for brute force given known equivalent keys). The best result I know is Jiazhe Chen, Meiqin Wang and Bart Preneel's Impossible Differential Cryptanalysis of the Lightweight Block Ciphers TEA, XTEA and HIGHT (AfricaCrypt 2012) (alternate link to free, updated paper), which claims success against 17 rounds out of 64. This suggests there is some safety margin.
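For reference, the original TEA is compact enough to sketch in full. The Python below is an illustrative implementation (there are no officially endorsed test vectors, as noted further down, so the checks are an encrypt/decrypt round trip plus the trivial equivalent-key property that caps brute force at $2^{126}$):

```python
def tea_encrypt(v0, v1, key):
    """Encrypt one 64-bit block (two 32-bit halves) under a 128-bit key
    given as four 32-bit words. 32 cycles = 64 rounds."""
    delta, mask, s = 0x9E3779B9, 0xFFFFFFFF, 0
    k0, k1, k2, k3 = key
    for _ in range(32):
        s = (s + delta) & mask
        v0 = (v0 + ((((v1 << 4) + k0) & mask) ^ ((v1 + s) & mask)
                    ^ (((v1 >> 5) + k1) & mask))) & mask
        v1 = (v1 + ((((v0 << 4) + k2) & mask) ^ ((v0 + s) & mask)
                    ^ (((v0 >> 5) + k3) & mask))) & mask
    return v0, v1

def tea_decrypt(v0, v1, key):
    """Inverse of tea_encrypt: run the schedule backwards."""
    delta, mask = 0x9E3779B9, 0xFFFFFFFF
    s = (delta * 32) & mask
    k0, k1, k2, k3 = key
    for _ in range(32):
        v1 = (v1 - ((((v0 << 4) + k2) & mask) ^ ((v0 + s) & mask)
                    ^ (((v0 >> 5) + k3) & mask))) & mask
        v0 = (v0 - ((((v1 << 4) + k0) & mask) ^ ((v1 + s) & mask)
                    ^ (((v1 >> 5) + k1) & mask))) & mask
        s = (s - delta) & mask
    return v0, v1

# Each key has an equivalent key: flipping the top bit of both k0 and k1
# (or of both k2 and k3) leaves encryption unchanged, because the two
# bit-31 flips cancel in the XOR of the round function's three terms.
key = (0x01234567, 0x89ABCDEF, 0xFEDCBA98, 0x76543210)   # arbitrary example key
equiv = (key[0] ^ 0x80000000, key[1] ^ 0x80000000, key[2], key[3])
ct = tea_encrypt(0xDEADBEEF, 0xCAFEBABE, key)
assert tea_decrypt(*ct, key) == (0xDEADBEEF, 0xCAFEBABE)
assert tea_encrypt(0xDEADBEEF, 0xCAFEBABE, equiv) == ct
```

The key and plaintext words above are arbitrary examples; the byte ordering used to map octet strings onto these 32-bit words is exactly the interoperability gap discussed in point 5 below.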

Vikram R. Andem's thesis (University of Chicago, 2003) also concluded TEA is safe, but the work is a bit outdated.


Answering the comment, reasons TEA is not much used:

  1. TEA came much after DES, which was standard and pushed by authorities (initially, because DES could be broken if needed due to its purposely small key size).
  2. 3DES is adequate for many uses, and most importantly is standard.
  3. TEA came significantly after IDEA, which was faster on most CPUs of the time.
  4. The security claims made by David J. Wheeler and Roger M. Needham in TEA, a Tiny Encryption Algorithm (proceedings of FSE 1994) were non-committing ("It is hoped it is safe") and came only with very basic justification.
  5. TEA was not defined in a manner suitable for interoperability: the mapping of octet strings to key and data has no standard definition (if I want TEA test vectors as octet strings endorsed by some official body, I do not know where to look! DES is much better in this regard). Even the number of rounds was not carved in stone: the article says "sixteen cycles may suffice and we suggest 32"; notice that this sentence recommends 64 rounds!
  6. TEA quickly turned out to be vulnerable under related-key attacks. This is not a problem when the key is random, but does not inspire confidence (especially since that, and the trivial equivalent keys, were not considered in the original article).
  7. XTEA was introduced shortly after TEA to fix related-key attacks; XTEA did not meet its design goals, and needed a correction XXTEA; the latter lost some of the simplicity of TEA, and requires slightly more rounds for equivalent security under random key when used as a 128-bit block cipher; also, XXTEA is badly broken in its wide-block variant (essentially because it uses too few rounds), see Elias Yarrkov's Cryptanalysis of XXTEA (2010).
  8. Each TEA key has a trivial equivalent key; normally this is a non-issue (when keys are 8-octet strings, DES is much worse); but amazingly, one of the few notable uses of TEA (as a building block in a poorly designed hash used in the original XBOX) fell precisely because of that!
  9. The above 4..8 tarnished the image of TEA (decision-makers often choose ciphers based on how they perceive their security or professionalism, or even on how they believe customers will perceive the choice!); 5 and 7 even created confusion about what TEA actually is (as illustrated by the link to the wrong algorithm in the initial question, and my mistake about the number of rounds in my initial answer).
  10. In hardware, TEA would be high-latency (and thus slow in CBC, OFB, CFB modes), because of its many rounds; as a matter of fact (and perhaps consequence) there is no hardware for TEA when there is for DES, 3DES and AES, including for the latter in many modern CPUs; that often makes TEA much slower than competitors.
  11. Nowadays, 64-bit ciphers are passé.

Note: The most recent related-key attack on TEA that I know is by John Kelsey, Bruce Schneier, and David Wagner, Related-Key Cryptanalysis of 3-WAY, Biham-DES, CAST, DES-X, NewDES, RC2, and TEA (ICICS 1997); it claims success with only $2^{23}$ chosen plaintexts and a single related-key query, but of a kind quite unrealistic: the related key swaps the halves of the original, plus other minute changes. However attacks only get better, and any key used for TEA really should be random.

Answer from fgrieu on crypto.stackexchange.com
🌐
Theknowledgeacademy
theknowledgeacademy.com › blog › data-encryption-standard
What is Data Encryption Standard(DES)? Explained in Detail
Once the gold standard for encryption, DES was the go-to option for keeping information secure in the digital realm. But as technology evolved, so did the ingenuity of techniques to break its shield. While the Data Encryption Standard is now considered outdated, it's ...
🌐
Grin
grin.com › shop › informatik - it-security
Data Encryption Standard (DES) and Issues of DES and its Replacement ...
Data Encryption Standard (DES) and Issues of DES and its Replacement - Computer Science / IT-Security - Scientific Essay 2022 - ebook - GRIN
Author: Haitham Ismail
Price: EUR 13.99
🌐
Townsendsecurity
info.townsendsecurity.com › bid › 72450 › What-are-the-Differences-Between-DES-and-AES-Encryption
What are the Differences Between DES and AES Encryption?
Anyone still choosing to use DES? We can help you move to AES and a solid encryption & key management solution! Learn more...
🌐
SSLInsights
sslinsights.com › home › wiki › what is des encryption: how it works?
What is DES Encryption: How DES Encryption Works?
October 7, 2024 - Block Size: A 64-bit block is considered small. Larger blocks are harder to break. S-Box Design: Some flaws allow optimizations to brute forcing. No Authenticity: Encrypts data but does not authenticate the source. Hardcoded S-Boxes: Makes analysis easier compared to dynamic S-boxes. While DES has held up relatively well, its age and design choices no longer make it suitable for general use cases demanding high security...
🌐
Stack Exchange
crypto.stackexchange.com › questions › 1978 › how-big-an-rsa-key-is-considered-secure-today
public key - How big an RSA key is considered secure today? - ...

Since 2000, on a given $\text{year}$, no RSA key bigger than $(\text{year} - 2000) \cdot 32 + 512$ bits has been openly factored other than by exploitation of a flaw of the key generator (a pitfall observed in poorly implemented devices including Smart Cards). This linear estimate of academic factoring progress should not be used for choosing a key length so as to be safe from attacks with high confidence (or, equivalently, conforming to standards with that aim), a goal best served by this website on keylength.

The current factoring record is the 829-bit RSA-250 in late Feb. 2020, see the summary by the CADO-NFS team. That came shortly after the 795-bit RSA-240 in Dec. 2019, see the detailed paper.
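As a quick sanity check on the linear estimate above, a few lines of Python (the record list below is just the events cited in this answer) confirm that the openly factored sizes stay under the bound:

```python
def academic_factoring_bound(year):
    # fgrieu's linear estimate: largest RSA key openly factored by a given year
    return (year - 2000) * 32 + 512

records = {
    2009: 512,  # TI-83 Plus OS signing key
    2019: 795,  # RSA-240
    2020: 829,  # RSA-250
}
for year, bits in records.items():
    assert bits <= academic_factoring_bound(year)

print(academic_factoring_bound(2020))  # -> 1152
```

As the answer stresses, this is a descriptive fit to academic progress, not a prescription for choosing key lengths.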

I emphasize that the above is about attacks actually performed by academics. As far as we know, hackers have always been some years behind (see below). On the other hand, it is very conceivable that well-funded government agencies are many years ahead in the factoring game. They have the hardware and CPU time. And there are so many 1024-bit keys around that it is likely a worthwhile technique to be in a position to break these. It is one of the few credible and conjectured explanations for claims of a cryptanalytic breakthrough by the NSA. Also, dedicated hardware could change the picture someday; e.g. as outlined by Daniel Bernstein and Tanja Lange: Batch NFS (in proceedings of SAC 2014; also in Cryptology ePrint Archive, November 2014). Or in the distant future, quantum computers usable for cryptanalysis.

By 2020, the main practical threat to systems still using 1024-bit RSA to protect commercial assets often is not factorization of a public modulus; but rather, penetration of the IT infrastructure by other means, such as hacking, and trust in digital certificates issued to entities that should not be trusted. With 2048 bits or more we are safe from that factorization threat for perhaps two decades, with fair (but not absolute) confidence.

Factorization progress is best shown on a graph (the graph and its raw data are embedded in the original answer).

This also shows the linear approximation at the beginning of this answer, which actually is a conjecture at even odds for the [2000-2016] period that I made privately circa 2002 in the context of deciding if the European Digital Tachograph project should be postponed to upgrade its 1024-bit RSA crypto (still widely used today). I committed it publicly in 2004 (in French). Also pictured are the three single events that I know of hostile factorization of an RSA key (other than copycats of these events, or exploitation of flawed key generator):

  • The Blacknet PGP Key in 1995. Alec Muffett, Paul Leyland, Arjen Lenstra, and Jim Gillogly covertly factored a 384-bit RSA key that was used to PGP-encipher "the BlackNet message" spammed over many Usenet newsgroups. There was no monetary loss.

  • The French "YesCard" circa 1998. An individual factored the 321-bit key then used (even though it was clearly much too short) in issuer certificates for French debit/credit bank Smart Cards. By proxy of a lawyer, he contacted the card issuing authority, trying to monetize his work. In order to prove his point, he made a handful of counterfeit Smart Cards and actually used them in metro ticket vending machines. He was caught and got a 10 months suspended sentence (judgment in French). In 2000 the factorization of the same key was posted (in French) and soon after, counterfeit Smart Cards burgeoned. These worked with any PIN, hence the name YesCard (in French) (other account in English). For a while, they caused some monetary loss in vending machines.

  • The TI-83 Plus OS Signing Key in 2009. An individual factored the 512-bit key used to sign downloadable firmware in this calculator, easing installation of custom OS, thus making him a hero among enthusiasts of the machine. There was no direct monetary loss, but the manufacturer was apparently less than amused. Following that, many 512-bit keys (including those of other calculators) have been factored.

Note: 512-bit RSA was no longer providing substantial security by 2000-2005. Despite that, reportedly, certificates with this key size were issued until 2011 by official Certification Authorities and used to sign malware, possibly by means of a hostile factorization.

🌐
Tech-FAQ
tech-faq.com › home
DES (Data Encryption Standard) - Tech-FAQ
April 6, 2019 - Today, DES is considered to be too insecure for a number of transactions primarily due to the 56-bit key size being too small for modern technology. In January of 1999, the Electronic Frontier Foundation and distributed.net were able to collaborate to break a DES key in less than 24 hours (22 hours and 15 minutes). In the form of Triple DES, the algorithm is believed to be secure...
🌐
DBLP
dblp.org › home
dblp: Security Implications of Using the Data Encryption Standard ...
April 25, 2024 - Scott G. Kelly: Security Implications of Using the Data Encryption Standard (DES). RFC 4772: 1-28 (2006)
🌐
Stack Exchange
crypto.stackexchange.com › questions › 29995 › what-is-the-most-secure-encryption-algorithm-to-encrypt-my-password-database
What is the most secure encryption algorithm to encrypt my password ...

The number denotes the size of the key used - for a given algorithm, the larger the key, the more secure it is - hence AES-256 is better than AES-192 is better than AES-128.

The key size is measured in bits - so in terms of its protection against a brute force attack (attacker tries all possible keys) a 256-bit key is not twice as hard to break as a 128-bit key, it is about $3.4\times10^{38}$ (that is, $2^{128}$) times as hard.

But there are other ways to attack encryption - exploiting mathematical flaws/coincidences in the algorithm. Although 3DES uses three 56-bit keys, the effective key size is only around 112 bits. But it is still considered better than RC2-128.
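To make that brute-force ratio concrete, Python's arbitrary-precision integers let you compute it exactly:

```python
# Keyspace sizes for the two AES variants being compared.
aes128_keys = 2 ** 128
aes256_keys = 2 ** 256

# Exhausting AES-256's keyspace takes this many times the work of AES-128:
ratio = aes256_keys // aes128_keys
assert ratio == 2 ** 128

print(f"{float(ratio):.2e}")  # -> 3.40e+38
```

The point of the calculation: doubling the key length squares, not doubles, the brute-force work.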

Which algorithm is the most secure?

Of those you have listed, AES-256

Does the algorithm provider matter?

No. In effect, it is just a list of available algorithms. But you do need to ensure the algorithm is available anywhere you want to decode the data.

Answer from symcbean on crypto.stackexchange.com
🌐
EITCA Academy
eitca.org › home › how did des serve as a foundation for modern encryption algorithms?
How did DES serve as a foundation for modern encryption algorithms?
August 3, 2023 - While DES itself is no longer considered secure due to advances in computing power and cryptanalysis techniques, its influence on modern encryption algorithms cannot be overstated. Many subsequent symmetric key algorithms, such as AES (Advanced Encryption Standard), draw inspiration from DES ...
🌐
Stack Overflow
stackoverflow.com › questions › 19098714 › why-des-is-insecure-how-do-we-know-when-to-stop-iterating-and-that-weve-found
c - Why DES is insecure? How do we know when to stop iterating ...

It's $2^{56}$, not $2^{55}$.

There are a couple of options for how to know when to stop. One is that you're doing a known-plaintext attack -- i.e., you know the actual text of a specific message, use that to learn the key, then use that to be able to read other messages. In many cases, you won't know the full text, but may know some pieces anyway -- for example, you may know something about an address block that's used with all messages of the type you care about, or if a file has been encrypted may have a recognizable header even though the actual content is unknown.

If you don't know (any of) the text for any message, you generally depend on the fact that natural languages are generally fairly redundant -- quite a bit is known about their structure. For a few examples, in English, you know that a space is generally the most common character, e is the most common letter, almost no word has more than two identical letters in a row, nearly all words contain at least one vowel, etc. In a typical case, you do a couple different levels of statistical analysis -- a really simple one that rules out most possibilities very quickly. For those that pass that test you do a second analysis that rules out the vast majority of the rest fairly quickly as well.

When you're done, it's possible you may need human judgement to choose between a few possibilities -- but in all honesty, that's fairly unusual. Statistical analysis is generally entirely adequate.
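A minimal sketch of such a first-pass statistical filter, with arbitrary illustrative thresholds (real attacks use sharper statistics, e.g. chi-squared against letter-frequency tables):

```python
from collections import Counter

def plausibly_english(text):
    """Cheap first-pass filter for candidate decryptions:
    space should be the most common character, and nearly
    every word should contain a vowel."""
    if not text or not text.split():
        return False
    most_common_char = Counter(text.lower()).most_common(1)[0][0]
    if most_common_char != ' ':
        return False
    words = text.split()
    with_vowel = sum(any(c in 'aeiouy' for c in w.lower()) for w in words)
    # 0.9 is an arbitrary cut-off for illustration, not a tuned value.
    return with_vowel / len(words) >= 0.9

assert plausibly_english("the quick brown fox jumps over the lazy dog")
assert not plausibly_english("xkqjzvfwmp")
```

A filter this crude rules out the overwhelming majority of wrong keys almost for free; survivors then go to a more expensive second-stage analysis, as described above.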

I should probably add that some people find statistical analysis problematic enough that they attempt to prevent it, such as by compressing the data with an algorithm like Huffman compression to maximize entropy in the compressed data.

Answer from Jerry Coffin on stackoverflow.com
🌐
InfoSec Insights
sectigostore.com › home › how does des encryption work in cryptography?
How Does DES Encryption Work in Cryptography? - InfoSec Insights
January 3, 2022 - How does DES encryption work? It involves breaking input data into 64-bit blocks to encrypt in rounds using cryptographic keys.
🌐
Symbiosis
symbiosisonlinepublishing.com › computer-science-technology › computerscience-information-technology32.php
A Comparison of Cryptographic Algorithms: DES, 3DES, AES, RSA and ...
We compute the Hamming distance as the sum of a bit-by-bit XOR of the ASCII values, as this is easy to implement programmatically. A high degree of diffusion i.e. high avalanche effect is desired. Avalanche effect reflects performance of cryptographic algorithm. Entropy: is the randomness collected by an application for use in cryptography that requires random data. A lack of entropy can have a negative impact on performance and security...
🌐
Quizlet
quizlet.com › 145525201 › encryption-types-flash-cards
Encryption types Flashcards | Quizlet
Study with Quizlet and memorize flashcards containing terms like WEP (Wired Equivalent Privacy), WPA (Wi-Fi Protected Access), WPA2 (Wi-Fi Protected Access 2) and more.
🌐
Stack Exchange
crypto.stackexchange.com › questions › 43544 › des-strength-and-weakness
DES strength and weakness - Cryptography Stack Exchange

The Wikipedia article @SEJPM links to is about as high-level an overview as you can really get. We can elaborate on some of the points.

DES is weak against Brute force in this day and age.

Actually, it was weak against brute force pretty much as soon as it was standardized. According to the Wikipedia article, the cipher was standardized in 1977. Reading further, in the section about brute force attacks:

In 1977, Diffie and Hellman proposed a machine costing an estimated US$20 million which could find a DES key in a single day.

So it is arguable that DES was never strong against brute force attacks (the standardized, post-NSA-consultation version, anyway - originally the designers did propose a "real" key size).

Why was DES so strong originally

Let's look at the section of the article for attacks faster than brute force:

  • Differential cryptanalysis is one area where DES was relatively strong. It's understood that IBM and the NSA both knew about differential cryptanalysis when DES was designed, and chose to keep this information secret. There exist differential attacks against DES, but they are not as devastating as they are against some other ciphers (e.g. FEAL)
    • Resistance to differential attack is determined by differential characteristics of the S-box. The DES s-box was designed with this in mind.
    • A "differential" is a pair of differences: The difference between two inputs to the function, and the difference between the two corresponding outputs. The probability that this output difference occurs for a pair of inputs with the given difference is the basis for the attack.
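The statistic behind the attack is easy to tabulate: a difference distribution table (DDT) counts, for each input difference, how often each output difference occurs, and designers pick S-boxes whose largest DDT entry is small. The sketch below uses the 4-bit PRESENT S-box as a convenient example (an illustrative choice; DES's eight 6-to-4-bit S-boxes tabulate the same way):

```python
def ddt(sbox):
    """Difference distribution table: table[din][dout] counts input
    pairs (x, x ^ din) whose S-box outputs differ by dout."""
    n = len(sbox)
    table = [[0] * n for _ in range(n)]
    for din in range(n):
        for x in range(n):
            table[din][sbox[x] ^ sbox[x ^ din]] += 1
    return table

# The PRESENT cipher's 4-bit S-box, used here only as an example.
PRESENT_SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
                0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]
t = ddt(PRESENT_SBOX)
# Zero input difference always gives zero output difference:
assert t[0][0] == 16
# The best (largest) nonzero differential is what an attacker exploits
# and what a careful S-box design keeps small:
best = max(t[din][dout] for din in range(1, 16) for dout in range(16))
print(best)
```

Entries come in pairs (x and x ^ din contribute together), so every count is even; a lower maximum means every differential characteristic through the S-box holds with lower probability.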

Are there any DES specific weaknesses that can be practically exploited?

Continuing further in the same section of the article as before, they discuss linear cryptanalysis.

  • DES was not known to be written with defense against linear cryptanalysis in mind. It is arguable whether or not the mentioned attacks constitute a "practical" break; These attacks only require known plaintext-ciphertext pairs, which are actually not uncommon in practice (e.g. HTTP GET) and can be practical to obtain, but it does require a large number of such pairs.

    • It is possible to defend against linear cryptanalysis by choosing s-box values appropriately, similarly to differential cryptanalysis. "Linearity" can be a somewhat nebulous concept and difficult to understand. Put in possibly oversimplified terms, the s-box should be as different as possible from any linear equation. Basically, it should be difficult to come up with a simple equation that accurately approximates the equation of the s-box.
  • DES has weak keys

    • A stronger key schedule should prevent weak keys
    • Weak keys are technically uncommon, but it's arguable that all 56-bit keys are weak
  • The 64 bit block size could be larger

    • This is technically more relevant to the security of the mode of operation used with the cipher than the cipher itself
Answer from Ella Rose on crypto.stackexchange.com
🌐
Quora
quora.com › Which-form-of-encryption-is-considered-more-secure-SHA1-or-MD5
Which form of encryption is considered more secure: SHA1 or MD5?
Answer: Neither is secure — and neither is encryption. SHA1 and MD5 are obsolete hash algorithms. Hashing is different from encryption because it’s not reversible. At one time SHA1 was a bit more secure than MD5, but both are outdated and insecure nowadays.
🌐
Testbook
testbook.com › home › full form › full form of des - data encryption standard | history, characteristics, advantages & disadvantages
Full Form of DES - Data Encryption Standard | History, Charact...
August 30, 2023 - Learn about the full form of DES, its history, characteristics, advantages, and disadvantages. Understand the importance of DES in the field of cryptography.
🌐
Reddit
reddit.com › r/crypto › does this des encryption have a weakness?
r/crypto on Reddit: Does this DES encryption have a weakness?

If you want message authentication you should run a real MAC like NIST CMAC (which you can run with a 64-bit block cipher).

You can truncate the output of CMAC if you are sensitive to space.
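A sketch of the CMAC construction from NIST SP 800-38B at a 64-bit block size, showing the subkey derivation, padding, and truncation mechanics the answer refers to. The `toy_cipher` here is a SHA-256-based stand-in, not a real 64-bit block cipher, so this illustrates the mode only:

```python
import hashlib

BLOCK = 8   # 64-bit block size, in bytes
RB = 0x1B   # subkey constant for 64-bit blocks (NIST SP 800-38B)

def toy_cipher(key, block):
    # Stand-in keyed function for illustration only; a real deployment
    # would use an actual 64-bit block cipher here (e.g. 3DES).
    return hashlib.sha256(key + block).digest()[:BLOCK]

def _dbl(b):
    # Doubling in GF(2^64): shift left one bit, fold the carry with RB.
    n = int.from_bytes(b, 'big') << 1
    if n >> (8 * BLOCK):
        n = (n ^ RB) & ((1 << (8 * BLOCK)) - 1)
    return n.to_bytes(BLOCK, 'big')

def _xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def cmac(key, msg, taglen=BLOCK):
    """CBC-MAC with the SP 800-38B subkey/padding fix; taglen lets the
    caller truncate the tag to save space, keeping the leading bytes."""
    k1 = _dbl(toy_cipher(key, bytes(BLOCK)))
    k2 = _dbl(k1)
    if msg and len(msg) % BLOCK == 0:
        blocks = [msg[i:i + BLOCK] for i in range(0, len(msg), BLOCK)]
        blocks[-1] = _xor(blocks[-1], k1)        # complete final block
    else:
        padded = msg + b'\x80' + bytes((-len(msg) - 1) % BLOCK)
        blocks = [padded[i:i + BLOCK] for i in range(0, len(padded), BLOCK)]
        blocks[-1] = _xor(blocks[-1], k2)        # padded final block
    state = bytes(BLOCK)
    for b in blocks:
        state = toy_cipher(key, _xor(state, b))
    return state[:taglen]

key = b'0123456789abcdef'            # example key bytes
tag = cmac(key, b'attack at dawn')
assert cmac(key, b'attack at dawn', taglen=4) == tag[:4]
```

Truncation simply keeps the leading bytes of the full tag, trading forgery resistance for space, which is the trade-off the answer alludes to.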