Factor this!

Joined: 24 May 10 | Posts: 7 | Credit: 1,956 | RAC: 0
682934139764871621059294670540409779256277785405904002174839913653988630704152088904136121991617552380023455259399723952023653153643548060833653509153580215597699356690840851264278901841510478481653190812472058220200188105772756389172416253 
Joined: 24 May 10 | Posts: 7 | Credit: 1,956 | RAC: 0
1. Random generation using random primes (factors = p96 * p145).

2. Wait, wait, wait... you only factor numbers of the form b^n + 1? By that reasoning, ask that question of everyone participating. Sorry for trying to contribute to the site... Sheesh. O_O
Joined: 26 Jun 08 | Posts: 594 | Credit: 251,957,170 | RAC: 31,853
Kevin, no offense is intended, at least from me anyway :), but there is a large number of unfactored numbers of interest to mathematicians, computer scientists, and laymen, enough to keep us busy for a very long time. We are mainly working on the Cunningham project numbers for now, but there are many other candidates of interest. Factoring a randomly generated semiprime with no special significance is not a good use of the volunteers' computer time. Greg
Joined: 2 Oct 09 | Posts: 50 | Credit: 111,128,218 | RAC: 0
"1. Random generation using random primes (factors = p96 * p145)."

There must be some range in between "random numbers, hard to factor" and "numbers of the form b^k ± 1" (including ones like b^(2n) - 1 = (b^n - 1)(b^n + 1)). You have a c240 or maybe a c241 (96+145). Is the intended point that it's no easier to factor than a prospective RSA number p*q, with p and q both primes of 120 digits? Since the current record for RSA numbers is RSA-768, at 232 decimal digits, factoring the number you've posed isn't much different from asking for a new record factorization of an RSA number of 240 digits. I don't know your background, but have you looked at the resources needed for setting some of the previous records? Numbers of comparable difficulty (size, for gnfs) with the snfs numbers we're working on here might have 180 digits; 170 digits is not so difficult, and 190 digits is harder than our current resources.

Or perhaps you intend factoring your number to be easier than factoring a 240-digit RSA key, due to the smaller p96 factor? There's already a challenge number of this type, with 1024 bits (c. 310 decimal digits), which is known to be of the form p1*p2*p3*p4, where each pi (i = 1, 2, 3, 4) is a prime of 256 bits. Factoring this previous challenge number by a method easier than factoring a 1024-bit RSA key requires a method, easier than sieving (gnfs or snfs), that can make use of having a 77-digit (c. 256-bit) prime factor. Perhaps this previous challenge is easier than yours (p77 instead of p96)? The person proposing this earlier challenge heads one of the top factoring research labs, and they have made some progress by finding p73 factors, setting a record for ecm; with the slight issue that the 73-digit primes are quite far from being random. Uhm, guess you're not going to be impressed: their primes all divide numbers 2^n - 1. If you want random, the best we can do is 68 digits. bdodson
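The algebraic identity mentioned above is easy to sanity-check. The sketch below (plain Python; b and n are arbitrary illustrative values, not from the thread) shows why numbers of the form b^(2n) - 1 start out partially factored, which is part of what makes fixed-form numbers more approachable than random semiprimes:

```python
# Sanity check of the identity b^(2n) - 1 = (b^n - 1) * (b^n + 1).
# b and n here are arbitrary illustrative values.
b, n = 5, 150
lhs = b**(2 * n) - 1
rhs = (b**n - 1) * (b**n + 1)
assert lhs == rhs
# So a 2n-exponent Cunningham-type number splits "for free" into two
# algebraic pieces, each with roughly half the digits of the whole.
```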
Joined: 24 May 10 | Posts: 7 | Credit: 1,956 | RAC: 0
"No offense is intended, at least from me anyway :)" Excellent! :) "but there are a large number of unfactored numbers that are of interest to mathematicians, computer scientists, and laymen that will keep us busy for a very long time" The Cunningham project? You could just add to it by contributing a random b^n + 1 or a^n + b^n Ex: 5^300 + 1, or 6^429 + 11^429. "We are mainly working on the Cunningham project numbers for now, but there are many other candidates of interest." I think there are somewhere between 730 and 1200 of those.. That's going to keep you busy for a long time indeed, and that's not even factoring in the new a^n + b^n you contribute to the site each and every day. "Factoring a randomly generated semiprime with no special significance is not a good use of the volunteer computer time. " No special significance? That's outrageous! Perhaps, at some point, the number I posted is a cofactor of a number of the form a^n + b^n, or a^n + 1. That seems significant to me. Perhaps it's the cofactor of a number n where n^2 + 1 is prime. Perhaps it's the cofactor of a fibonacci number. The number can also be a cofactor to a Lucas number, the cofactor to an n^21 number, the cofactor to an n^2 + 1 number, among many (hypothetical) other properties that I can list. That doesn't seem "insignificant" to me. And, by that logic, apply the same reasoning to your own members, as I said before. Your reasoning, turned on you: "Factoring a randomly generated a^n + b^n number is not a good use of volunteer computer time." "You have a c240 or maybe c241 (96+145). Is the intended point that it's no easier to factor than a prospective RSA number p*q, with p and q both primes with 120digits?" It's c240. And, that point would be false, a difference in size between the prime factors makes it easier to factor. That can be determined by common sense alone. 
"Since the current record for RSA numbers is RSA768, 233decimal digits; factoring the number you've posed isn't much different than asking for a new record factorization of an RSA number, with 240digits." But this would not count, as it would be too easy (Because of the difference in size of the prime factors in question) "I don'thave know your background, but have you looked at the resources needed for setting some of the previous records?" All previous records and RSA768 were all factored using SNFS or GNFS. " Numbers of comparable difficulty (size, for gnfs) with the snfs numbers we're working on here might have 180digits; with 170digits notsodifficult, and 190digits harder than our current resources." You might as well begin trying to break someone else's RSA keys. (155 digits, or a typical 512bit RSA modulus, would be incredibly easy to do, by that reasoning.) "There's already a challenge number of this type, with 1024bits (c. 310decimal), which is known to be of the form p1*p2*p3*p4, where each pi is a prime (i = 1,2,3,4) with 256bits. Factoring this previous challenge number by a method easier than factoring a 1024bit RSAkey requires a method, easier than sieving (gnfs or snfs) that would be able to make use of having a 77digit prime factor (c. 256bits). Perhaps this previous challenge is easier than yours (p77 instead of p96)?" Four 77digit numbers? Wouldn't this be easier by searching for factors in the range of 10^76 or 10^77? And why not use GNFS/SNFS? It's your best method for numbers of that size. (It is unlikely that there will be any methods better than GNFS/SNFS for factoring.) And as for it being an easier challenge, I would be inclined to agree that it is easier, but it's four 77digit numbers. Repeating the process to find three of the four factors? That will take you a few months. 
Factoring p96 * p145 would be somewhat easier given the difference in factor size, and the fact that you only need to find one factor of the two to complete the factorization. But the p96 is 19 digits larger than the p77, so I'm guessing that the two factorizations would take an equal amount of time.

"The person proposing this earlier challenge heads one of the top factoring research labs, and they have made some progress by finding p73 factors, setting a record for ecm; with the slight issue that the 73-digit primes are quite far from being random" Hmm. Interesting. I think that a p80 might be found by ECM sometime soon. (By sometime soon, I mean 10 ± 5 years.) And, of course, it was found from 2^1181 - 1.

"Uhm, guess you're not going to be impressed: their primes all divide numbers 2^n - 1. If you want random, the best we can do is 68 digits." This is well known, and... 68 digits? I can do that in 5 minutes!
Joined: 26 Jun 08 | Posts: 594 | Credit: 251,957,170 | RAC: 31,853
"a^n + b^n" Those are the homogeneous Cunningham numbers. See http://www.leyland.vispa.com/numth/factorization/anbn/main.htm

"No special significance? That's outrageous! Perhaps, at some point, the number I posted is a cofactor of a number of the form ..." Perhaps, but likely not.

"It's a c240. And that point would be false: a difference in size between the prime factors makes it easier to factor. That can be determined by common sense alone." Actually, in this case "common sense" is leading you astray. If the smaller factor were 60-65 digits or less, then ECM would make factoring the number much easier. However, a C240 = P96 * P145 is no easier than a C240 = P120 * P120. And if the number has no special form, then it is harder than the current record GNFS factorization.

"You might as well begin trying to break someone else's RSA keys. (155 digits, or a typical 512-bit RSA modulus, would be incredibly easy to do, by that reasoning.)" And in fact it is. That is a large part of the point of this project. RSA-512 is totally broken. There has been one successful factorization of RSA-768, and we hope to improve our software to be able to factor more numbers of that size soon.

"Four 77-digit numbers? Wouldn't this be easier by searching for factors in the range of 10^76 or 10^77?" No. There are approximately 5*10^74 77-digit prime numbers. If you could test one every clock cycle (which is nowhere near possible), a 2 GHz computer would take 8*10^57 CPU-years, or about 10^47 times the age of the universe.

"And why not use GNFS/SNFS? It's your best method for numbers of that size." That size is currently out of reach of GNFS. But there are people working on it.

"This is well known, and... 68 digits? I can do that in 5 minutes!" He is referring to a factor of 68 digits, not a composite of 68 digits. The current world record ECM factor of a number of a form other than 2^n ± 1 is 68 digits.
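The count of 77-digit primes quoted above follows from the prime number theorem, pi(x) ≈ x / ln(x). A quick back-of-the-envelope check (a sketch for illustration, not code from the project):

```python
import math

# Prime number theorem approximation: pi(x) ~ x / ln(x).
def pi_approx(x: float) -> float:
    return x / math.log(x)

# Number of 77-digit primes: those between 10^76 and 10^77.
count = pi_approx(1e77) - pi_approx(1e76)   # roughly 5e74

# Time to test one candidate per clock cycle on a 2 GHz machine:
cycles_per_year = 2e9 * 3600 * 24 * 365
years = count / cycles_per_year             # roughly 8e57 CPU-years
```

Both figures match the ones quoted in the post, which is why brute-force search over the candidate primes is hopeless.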
Joined: 2 Oct 09 | Posts: 50 | Credit: 111,128,218 | RAC: 0
... OK, you're officially wasting the time of the people reading your posts. The numbers on each of the 18 Cunningham lists have specific limits on n, given in terms of b. There have been no new numbers added for some years, and the most recent "official" count was 458 as of April '10. (I'm kidding about the first "official(ly)", not the second; I'm not an official of either the Cunningham project or NFS@Home.) ... One of us lacks an elementary education in the algorithms of modern computational number theory.
Again.
False. I'm a co-author of the paper on the factorization of RSA-120; we used QS (cf. MPQS, ppmpqs, ppmpqs with larger large primes). Likewise RSA-100 and RSA-110. Perhaps the most famous of all record factorizations was the original challenge number of Rivest-Shamir-Adleman, which appeared in a famous math puzzle column in Scientific American, in the first public description of the RSA cryptographic system: RSA-129. Also done by QS (with ppmpqs and larger large primes). So we now know, beyond any doubt, the state of your knowledge of previous records.
Again. I did. As reported in the paper on the factorization of RSA-155, Eurocrypt 2000. More than 10 years ago (with the factors found in 1999). That was the state of the art at that time, just as RSA-768 is the current state of the art. What is the point of this comment? Yes, 512 bits, 155 decimal digits, is easy now. My statement (the one quoted) is a correct description of the current resources of NFS@Home, which are somewhat larger than the projects of Batalov+Dodson, where we recently did a c180 by gnfs.
Again. An undergraduate class in elementary number theory would usually include at least the statement of the prime number theorem (I advised an undergrad honors project in which the student presented the proof; he is currently pursuing graduate studies at Univ Wisc.). There are too many primes smaller than 10^70 to search them all; more than the number of atoms in the universe. You're describing a computation with exponential runtime. You've taken calculus? All of QS, ECM and NFS are subexponential methods. ECM has a runtime that does depend on the size of the smallest prime factor; the record-size prime found this way is 68 digits for general numbers, 73 digits for 2^n - 1. Not so QS and NFS, each of which would take weeks even to find a factor of 3 in a 180-digit composite. This was reported in Time magazine in 1983; it's not even number theory anymore, just basic knowledge for educated people. I'll try again. Your view of the effect of the "difference in size of the prime factors in question" is not the view of a talented amateur. You may very well be bright, but your comments do more to confirm that "a little knowledge is a dangerous thing" than to make a positive contribution to our project. Unofficially speaking. bdodson
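The exponential-versus-subexponential distinction drawn above is usually written with the standard L-notation; the runtimes below are the commonly quoted heuristic complexities, stated here for reference rather than taken from the thread:

```latex
% L-notation: \alpha = 1 is fully exponential, \alpha = 0 polynomial.
L_n[\alpha, c] = \exp\bigl( (c + o(1)) (\ln n)^{\alpha} (\ln\ln n)^{1-\alpha} \bigr)

% Trial division: \approx \sqrt{p} steps, p the smallest prime factor (exponential).
% QS:   L_n\bigl[\tfrac{1}{2},\, 1\bigr]                    (depends only on n)
% ECM:  L_p\bigl[\tfrac{1}{2},\, \sqrt{2}\bigr]             (depends on the smallest prime factor p)
% GNFS: L_n\bigl[\tfrac{1}{3},\, (64/9)^{1/3}\bigr]         (depends only on n)
```

This is exactly why ECM runtime tracks the smallest factor while QS and NFS see only the size of n itself.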

Joined: 24 May 10 | Posts: 7 | Credit: 1,956 | RAC: 0
"Those are the homogeneous Cunningham numbers. See http://www.leyland.vispa.com/numth/factorization/anbn/main.htm " Is this where there is a list of 732 of them? "Perhaps, but likely not." Nope. There's a 100% chance that it has at least one of the properties I listed. "Actually, in this case "common sense" is leading you astray. If the smaller number was 6065 digits or less, then ECM would make factoring the number much easier. However, a C240 = P96 * P145 is no easier than a C240 = P120 * P120. And if the number has no special form, then it is harder than the current record GNFS factorization." If the smaller factor were in fact a c31 to c36, any average joe would be able to factor it in about an hour using ECM. How is it equally difficult? A difference in size of 49 digits makes it in fact far easier than p120 * 120. Think of it this way: Suppose you had to sift through 96 objects to find a certain object out of 240 in one pile, and you had to sift through 120 objects in another pile. It'll be easier to sift through 96 objects than 120 objects. In fact, it would be faster attempting to pick out a p96 factor than it would be trying to pick out a p120 factor. The point of that (somewhat mediocre) analogy is that semiprimes whose factors are of identical size are more difficult to factor. That's the main reason why RSA moduli are generated using primes of similar/identical size. Because it's more difficult to factor a semiprime composed of primes of similar or identical size than it is to factor a semiprime whose prime factors vary in size. Because the running time needed for a factoring algorithm depends on the size of the smallest prime factor (Almost all (if not all) factoring methods work this way.). You can't dismiss the painfully obvious. : "And in fact it is. That is a large part of the point of this project. RSA512 is totally broken. 
There has been one successful factorization of RSA-768, and we hope to improve our software to be able to factor more numbers of that size soon." RSA-768 style numbers? I think you should give it 15 ± 10 years.

"That size is currently out of reach of GNFS. But there are people working on it." If they're not using SNFS, I don't know what it is that they're using... Fermat's, Dixon's, trial division, MPQS, QS, SIQS are not recommended for p77 factors...

"He is referring to a factor of 68 digits, not a composite of 68 digits. The current world record ECM factor of a number of a form other than 2^n ± 1 is 68 digits." Is this referring to a semiprime? If not, can you list ECM's world record for that?
Joined: 26 Jun 08 | Posts: 594 | Credit: 251,957,170 | RAC: 31,853
"A difference in size of 49 digits makes it in fact far easier than p120 * p120. ... the running time needed for a factoring algorithm depends on the size of the smallest prime factor (almost all, if not all, factoring methods work this way). You can't dismiss the painfully obvious."

Just for the record, for others reading this thread: this is simply wrong. :) The run time of some algorithms does depend mostly on the size of the penultimate (second-largest) factor, such as trial factoring, P±1, and ECM; others depend only on the size of the number to be factored and not on the size of the factors, such as QS, SNFS, and GNFS. Even having equal-size factors that are too close together can make the number easy to factor by the Fermat method. For the case of P96 * P145, it is in fact no easier than P120 * P120. Now back to our regularly scheduled sieving...
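The point about the Fermat method can be seen in a few lines. This is a minimal illustrative sketch (not project code) of Fermat's technique, which writes n = a^2 - b^2 = (a - b)(a + b) and only terminates quickly when the two factors are close to sqrt(n):

```python
import math

def fermat_factor(n: int) -> tuple[int, int]:
    """Fermat's method: find a, b with n = a^2 - b^2 = (a - b)(a + b).
    Each step increments a by 1, so this is only fast when the factors
    lie within roughly n^(1/4) of sqrt(n)."""
    a = math.isqrt(n)
    if a * a < n:
        a += 1
    while True:
        b2 = a * a - n
        b = math.isqrt(b2)
        if b * b == b2:
            return a - b, a + b
        a += 1

# Two close primes: found on the very first iteration.
p, q = fermat_factor(10007 * 10009)   # returns (10007, 10009)
```

With factors of very different sizes the loop runs for roughly sqrt(n) steps, which is why the method is a threat only to moduli whose primes are chosen too close together.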
Joined: 24 May 10 | Posts: 7 | Credit: 1,956 | RAC: 0
I took the time to research... It turns out I was incorrect on all counts. A sad day for me. The BOINC program slowed my computer to a crawl, to be honest...
Joined: 24 May 10 | Posts: 7 | Credit: 1,956 | RAC: 0
"Even having equal size factors that are too close together can make the number easy to factor by the Fermat method." True.. But that's only if it's within sqrt(n) (n^(1/4)) Also, there are nonmath means to break an RSA key. (Ex: Timing attacks) 